Console Output (Success)

[... 237 KB of earlier output skipped ...]
objects of class: dhcpPoolDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeasesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPoolDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpErrorLog [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubclassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassData [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPrimaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSecondaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSharedNetworkDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeaseDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:21: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:21: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:21: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:21: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:21: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:21: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:21: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20170410-17:59:21: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20170410-17:59:21: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20170410-17:59:21: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20170410-17:59:21: [INFO]: fetching the cache named system [org.apache.directory.server.core.api.CacheService]
20170410-17:59:21: [INFO]: No cache with name system exists, creating one [org.apache.directory.server.core.api.CacheService]
20170410-17:59:22: [INFO]: Keys and self signed certificate successfully generated. [org.apache.directory.server.core.security.TlsKeyGenerator]
20170410-17:59:22: [INFO]: fetching the cache named groupCache [org.apache.directory.server.core.api.CacheService]
20170410-17:59:22: [INFO]: Initializing ... [org.apache.directory.server.core.event.EventInterceptor]
20170410-17:59:22: [INFO]: Initialization complete. [org.apache.directory.server.core.event.EventInterceptor]
20170410-17:59:22: [WARN]: You didn't change the admin password of directory service instance 'org'.  Please update the admin password as soon as possible to prevent a possible security breach. [org.apache.directory.server.core.DefaultDirectoryService]
20170410-17:59:22: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:22: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:22: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:22: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:22: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:23: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170410-17:59:23: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20170410-17:59:23: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20170410-17:59:23: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20170410-17:59:23: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20170410-17:59:23: [INFO]: fetching the cache named org [org.apache.directory.server.core.api.CacheService]
20170410-17:59:23: [INFO]: No cache with name org exists, creating one [org.apache.directory.server.core.api.CacheService]
20170410-17:59:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:23: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170410-17:59:23: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170410-17:59:24: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:24: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170410-17:59:27: [INFO]: Successful bind of an LDAP Service (9389) is completed. [org.apache.directory.server.ldap.LdapServer]
20170410-17:59:27: [INFO]: Ldap service started. [org.apache.directory.server.ldap.LdapServer]
20170410-17:59:27: [INFO]: Loading XML bean definitions from class path resource [org/dataone/configuration/testApplicationContext.xml] [org.springframework.beans.factory.xml.XmlBeanDefinitionReader]
20170410-17:59:27: [INFO]: Refreshing org.springframework.context.support.GenericApplicationContext@572153d0: startup date [Mon Apr 10 17:59:27 UTC 2017]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20170410-17:59:27: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@6d187d70: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
Apr 10, 2017 5:59:28 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Hazelcast Group Config:
GroupConfig [name=DataONEBuildTest, password=*******************]
Hazelcast Maps: hzSystemMetadata hzReplicationTasksMap hzNodes 
Hazelcast Queues: hzReplicationTasks 
Apr 10, 2017 5:59:28 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Apr 10, 2017 5:59:28 PM com.hazelcast.impl.AddressPicker
WARNING: Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Apr 10, 2017 5:59:28 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5730, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5730], bind any local is true
Apr 10, 2017 5:59:28 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5730
Apr 10, 2017 5:59:28 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Apr 10, 2017 5:59:28 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTING
Apr 10, 2017 5:59:28 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Apr 10, 2017 5:59:28 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Apr 10, 2017 5:59:28 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Apr 10, 2017 5:59:28 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Apr 10, 2017 5:59:29 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Apr 10, 2017 5:59:29 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Apr 10, 2017 5:59:29 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 


Members [1] {
	Member [127.0.0.1]:5730 this
}

Apr 10, 2017 5:59:29 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTED
Apr 10, 2017 5:59:29 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Apr 10, 2017 5:59:29 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5731, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5731], bind any local is true
Apr 10, 2017 5:59:29 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5731
Apr 10, 2017 5:59:29 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Apr 10, 2017 5:59:29 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTING
Apr 10, 2017 5:59:29 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Apr 10, 2017 5:59:29 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Apr 10, 2017 5:59:29 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Apr 10, 2017 5:59:29 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:40150
Apr 10, 2017 5:59:29 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 40150 accepted socket connection from /127.0.0.1:5730
Apr 10, 2017 5:59:29 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:40150
Apr 10, 2017 5:59:30 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:30 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Apr 10, 2017 5:59:30 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:35 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:35 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
}

Apr 10, 2017 5:59:35 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
}

Apr 10, 2017 5:59:36 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:37 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTED
Apr 10, 2017 5:59:37 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Apr 10, 2017 5:59:37 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5732, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5732], bind any local is true
Apr 10, 2017 5:59:37 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5732
Apr 10, 2017 5:59:37 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Apr 10, 2017 5:59:37 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTING
Apr 10, 2017 5:59:37 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Apr 10, 2017 5:59:37 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Apr 10, 2017 5:59:37 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:55358
Apr 10, 2017 5:59:37 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 55358 accepted socket connection from /127.0.0.1:5730
Apr 10, 2017 5:59:37 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 37855 accepted socket connection from /127.0.0.1:5731
Apr 10, 2017 5:59:37 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:55358
Apr 10, 2017 5:59:37 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:37855
Apr 10, 2017 5:59:37 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:37855
Apr 10, 2017 5:59:38 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:38 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5731
Apr 10, 2017 5:59:38 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:38 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:43 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:43 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732
}

Apr 10, 2017 5:59:43 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Apr 10, 2017 5:59:43 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Apr 10, 2017 5:59:44 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Apr 10, 2017 5:59:45 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTED
Hazelcast member hzMember name: hzProcessInstance
Hazelcast member h1 name: hzProcessInstance1
Hazelcast member h2 name: hzProcessInstance2
Cluster size 3
hzProcessInstance's InetSocketAddress: /127.0.0.1:5730
hzProcessInstance's InetSocketAddress: /127.0.0.1:5731
hzProcessInstance's InetSocketAddress: /127.0.0.1:5732
Apr 10, 2017 5:59:46 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcastTestClientConf.xml'.
20170410-17:59:46: [INFO]: group DataONEBuildTest addresses 127.0.0.1:5730 [org.dataone.cn.hazelcast.HazelcastClientFactory]
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Apr 10, 2017 5:59:46 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:40774
Apr 10, 2017 5:59:46 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:40774
Apr 10, 2017 5:59:46 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:40774 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Apr 10, 2017 5:59:46 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:40775
Apr 10, 2017 5:59:46 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:40775
Apr 10, 2017 5:59:46 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:40775 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Apr 10, 2017 5:59:46 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
20170410-17:59:46: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: loading into client truststore: java.io.InputStreamReader@6d8fe3d4 [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 0 alias CN=DataONE Root CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 1 alias CN=DataONE Production CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 2 alias CN=CILogon Basic CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 3 alias CN=CILogon OpenID CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 4 alias CN=CILogon Silver CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 5 alias CN=RapidSSL CA,O=GeoTrust\, Inc.,C=US [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 6 alias CN=ISRG Root X1,O=Internet Security Research Group,C=US [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: 7 alias CN=DST Root CA X3,O=Digital Signature Trust Co. [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170410-17:59:46: [WARN]: Starting monitor thread [org.dataone.client.utils.HttpConnectionMonitorService]
20170410-17:59:46: [WARN]: Starting monitoring... [org.dataone.client.utils.HttpConnectionMonitorService]
20170410-17:59:46: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170410-17:59:47: [INFO]: RestClient.doRequestNoBody, thread(1) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v2/node [org.dataone.client.rest.RestClient]
20170410-17:59:47: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
20170410-17:59:47: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@250b47ee: startup date [Mon Apr 10 17:59:47 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
20170410-17:59:47: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:47: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:47: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:47: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:47: [INFO]: Creating embedded database 'testdb' [org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory]
20170410-17:59:47: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20170410-17:59:47: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20170410-17:59:47: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20170410-17:59:47: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20170410-17:59:48: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20170410-17:59:48: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20170410-17:59:48: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20170410-17:59:48: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20170410-17:59:48: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20170410-17:59:48: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20170410-17:59:48: [INFO]: Using dialect: org.hibernate.dialect.H2Dialect [org.hibernate.dialect.Dialect]
20170410-17:59:48: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20170410-17:59:48: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20170410-17:59:48: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20170410-17:59:48: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20170410-17:59:48: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20170410-17:59:48: [INFO]: Type registration [materialized_clob] overrides previous : org.hibernate.type.MaterializedClobType@5d1eeb3f [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [wrapper_materialized_blob] overrides previous : org.hibernate.type.WrappedMaterializedBlobType@18e6a4dc [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@4e517449 [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [wrapper_characters_clob] overrides previous : org.hibernate.type.CharacterArrayClobType@45a98cca [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [blob] overrides previous : org.hibernate.type.BlobType@7e5a4580 [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [java.sql.Blob] overrides previous : org.hibernate.type.BlobType@7e5a4580 [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [characters_clob] overrides previous : org.hibernate.type.PrimitiveCharacterArrayClobType@5889174e [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [clob] overrides previous : org.hibernate.type.ClobType@10592f4b [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Type registration [java.sql.Clob] overrides previous : org.hibernate.type.ClobType@10592f4b [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170410-17:59:48: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170410-17:59:48: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170410-17:59:48: [INFO]: updating schema [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170410-17:59:48: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TASK_QUEUE [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: columns: [id, nextexecution, status, pid, trycount] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: indexes: [index_pid_task, index_exec_task, primary_key_1] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TRY_HISTORY [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: columns: [id, lastreplicationattemptdate, pid, replicationattempts, nodeid] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: indexes: [index_pid, primary_key_b] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170410-17:59:48: [INFO]: schema update complete [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170410-17:59:48: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@73be0bf6: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationH2RepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationPostgresRepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationTaskRepository,replicationAttemptHistoryRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:48: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@2ff4ae90: startup date [Mon Apr 10 17:59:48 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
Apr 10, 2017 5:59:48 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Initializing cluster partition table first arrangement...
20170410-17:59:48: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:48: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170410-17:59:48: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:48: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:48: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:48: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20170410-17:59:48: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20170410-17:59:48: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20170410-17:59:48: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20170410-17:59:48: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20170410-17:59:48: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20170410-17:59:48: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20170410-17:59:48: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20170410-17:59:48: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20170410-17:59:48: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20170410-17:59:48: [INFO]: Using dialect: org.hibernate.dialect.PostgreSQLDialect [org.hibernate.dialect.Dialect]
20170410-17:59:48: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20170410-17:59:48: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
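Annotation: note the mismatch visible just above. Hibernate reports "Using dialect: org.hibernate.dialect.PostgreSQLDialect" while the injected DataSource is actually in-memory H2 1.3.163, a direct consequence of the Postgres factory winning the bean override. A minimal sketch, assuming spring-orm and Hibernate on the classpath, of test JPA wiring that keeps the dialect consistent with the test database; the method and package names are illustrative, not taken from the DataONE sources:

    import javax.sql.DataSource;
    import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
    import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

    class H2JpaWiringSketch {
        LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource h2DataSource) {
            HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
            // Match the dialect to the actual test database instead of PostgreSQLDialect.
            adapter.setDatabasePlatform("org.hibernate.dialect.H2Dialect");
            adapter.setGenerateDdl(true);  // lets hbm2ddl create the replication tables

            LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
            emf.setDataSource(h2DataSource);
            emf.setJpaVendorAdapter(adapter);
            emf.setPackagesToScan("org.dataone.cn.data.repository");
            return emf;
        }
    }
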
20170410-17:59:48: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20170410-17:59:48: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20170410-17:59:48: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20170410-17:59:48: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20170410-17:59:48: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20170410-17:59:48: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@4e517449 [org.hibernate.type.BasicTypeRegistry]
20170410-17:59:48: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170410-17:59:48: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170410-17:59:48: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170410-17:59:48: [ERROR]: could not get database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
	at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
	at org.h2.message.DbException.get(DbException.java:169)
	at org.h2.message.DbException.get(DbException.java:146)
	at org.h2.command.Parser.readTableOrView(Parser.java:4758)
	at org.h2.command.Parser.readTableFilter(Parser.java:1080)
	at org.h2.command.Parser.parseSelectSimpleFromPart(Parser.java:1686)
	at org.h2.command.Parser.parseSelectSimple(Parser.java:1793)
	at org.h2.command.Parser.parseSelectSub(Parser.java:1680)
	at org.h2.command.Parser.parseSelectUnion(Parser.java:1523)
	at org.h2.command.Parser.parseSelect(Parser.java:1511)
	at org.h2.command.Parser.parsePrepared(Parser.java:405)
	at org.h2.command.Parser.parse(Parser.java:279)
	at org.h2.command.Parser.parse(Parser.java:251)
	at org.h2.command.Parser.prepareCommand(Parser.java:217)
	at org.h2.engine.Session.prepareLocal(Session.java:415)
	at org.h2.engine.Session.prepareCommand(Session.java:364)
	at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1121)
	at org.h2.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:70)
	at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.initSequences(DatabaseMetadata.java:151)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:69)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:62)
	at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:170)
	at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:375)
	at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
	at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
	at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
	at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:287)
	at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
	at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
	at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1105)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:915)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:472)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
	at org.dataone.cn.model.repository.D1BaseJpaRepositoryConfiguration.initContext(D1BaseJpaRepositoryConfiguration.java:39)
	at org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory.getReplicationTaskRepository(ReplicationPostgresRepositoryFactory.java:68)
	at org.dataone.service.cn.replication.ReplicationFactory.getReplicationTaskRepository(ReplicationFactory.java:74)
	at org.dataone.service.cn.replication.ReplicationTaskProcessor.<clinit>(ReplicationTaskProcessor.java:20)
	at org.dataone.service.cn.replication.ReplicationManager.startReplicationTaskProcessing(ReplicationManager.java:1028)
	at org.dataone.service.cn.replication.ReplicationManager.<init>(ReplicationManager.java:183)
	at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:175)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
	at org.junit.runners.Suite.runChild(Suite.java:128)
	at org.junit.runners.Suite.runChild(Suite.java:24)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
	at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
	at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
	at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
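Annotation: the failing statement, select relname from pg_class where relkind='S', is the PostgreSQL-dialect sequence lookup issued by SchemaUpdate; H2 has no pg_class catalog, so metadata initialization aborts and no tables are created. A standalone JDBC sketch, assuming only the H2 driver, showing the same statement failing against in-memory H2 while an H2-native sequence query succeeds:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    class PgCatalogOnH2Sketch {
        public static void main(String[] args) throws SQLException {
            try (Connection c = DriverManager.getConnection("jdbc:h2:mem:sketch");
                 Statement st = c.createStatement()) {
                try {
                    // The PostgreSQL-dialect sequence lookup from the log above:
                    // fails on H2 with Table "PG_CLASS" not found.
                    st.executeQuery("select relname from pg_class where relkind='S'");
                } catch (SQLException expected) {
                    System.out.println(expected.getMessage());
                }
                // The kind of query an H2 dialect would use instead.
                st.executeQuery("select sequence_name from information_schema.sequences");
            }
        }
    }
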
20170410-17:59:48: [ERROR]: could not complete schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
	(stack trace identical to the preceding "could not get database metadata" error)
20170410-17:59:48: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@d3840aa: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationPostgresRepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationH2RepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationTaskRepository,replicationAttemptHistoryRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170410-17:59:48: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20170410-17:59:48: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ [42102-163] [org.hibernate.util.JDBCExceptionReporter]
20170410-17:59:48: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170410-17:59:48: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170410-17:59:48: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:48: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:48: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:48: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170410-17:59:48: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:48: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:48: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:48: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20170410-17:59:48: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ where replicatio0_.status=? and replicatio0_.nextExecution<? order by replicatio0_.nextExecution asc limit ? [42102-163] [org.hibernate.util.JDBCExceptionReporter]
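Annotation: because the schema update failed, hbm2ddl never created the tables bound earlier (replication_task_queue, replication_try_history), so these repository queries now fail with Table "REPLICATION_TASK_QUEUE" not found. A hedged sketch of DDL consistent with the columns in the failing SELECT (id, nextExecution, pid, status, tryCount); the column types are assumptions, not taken from the ReplicationTask entity:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    class ReplicationTaskTableSketch {
        public static void main(String[] args) throws SQLException {
            try (Connection c = DriverManager.getConnection("jdbc:h2:mem:sketch")) {
                // Column names mirror the generated SELECT in the log; types are guesses.
                c.createStatement().execute(
                    "create table replication_task_queue ("
                    + " id bigint primary key,"
                    + " nextExecution bigint,"
                    + " pid varchar(800),"
                    + " status varchar(32),"
                    + " tryCount integer)");
            }
        }
    }
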
20170410-17:59:48: [INFO]: RestClient.doRequestNoBody, thread(90) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v1/node [org.dataone.client.rest.RestClient]
20170410-17:59:48: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
20170410-17:59:50: [INFO]: testCreateAndQueueTask replicationManager.createAndQueueTask [org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit]
20170410-17:59:50: [INFO]: node initial refresh: new cached time: Apr 10, 2017 5:59:50 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20170410-17:59:50: [INFO]: for pid: 42 source MN: urn:node:testmn1 service info: MNRead v1 [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:50: [INFO]: for pid: 42 source MN: urn:node:testmn1 service info: MNRead v2 [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:50: [INFO]: for pid: 42, source MN: urn:node:testmn1 requires v2 replication. [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:51: [INFO]: for pid: 42, target MN: urn:node:testmn5 supports v2 MNReplication. [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:51: [INFO]: for pid: 42, target MN: urn:node:testmn2 supports v2 MNReplication. [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:51: [INFO]: Retrieving performance metrics for the potential replication list for 42 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:51: [INFO]: Priority score for urn:node:testmn5 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:51: [INFO]: Priority score for urn:node:testmn2 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:51: [WARN]: In Replication Manager, task that should exist 'in process' does not exist.  Creating new task for pid: 42 [org.dataone.service.cn.replication.ReplicationManager]
Apr 10, 2017 5:59:51 PM com.hazelcast.impl.ClientHandlerService
SEVERE: [127.0.0.1]:5730 [DataONEBuildTest] null
java.lang.NullPointerException
	at com.hazelcast.impl.ConcurrentMapManager.doPutAll(ConcurrentMapManager.java:1025)
	at com.hazelcast.impl.ClientHandlerService$MapPutAllHandler.processMapOp(ClientHandlerService.java:895)
	at com.hazelcast.impl.ClientHandlerService$ClientMapOperationHandler.processCall(ClientHandlerService.java:1603)
	at com.hazelcast.impl.ClientHandlerService$ClientOperationHandler.handle(ClientHandlerService.java:1565)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:57)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:54)
	at com.hazelcast.impl.ClientRequestHandler.doRun(ClientRequestHandler.java:63)
	at com.hazelcast.impl.FallThroughRunnable.run(FallThroughRunnable.java:22)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
	at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
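Annotation: the Hazelcast 2.x NullPointerException above is raised inside ClientHandlerService while it processes a client map putAll, just as the member at 127.0.0.1:5730 begins shutting down a few lines later; it reads like a shutdown race rather than bad test data. A defensive client-side sketch, assuming a HazelcastInstance obtained elsewhere; the method and map names are illustrative:

    import java.util.Map;
    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.core.IMap;

    class SafePutAllSketch {
        // Skip empty batches and tolerate a member disappearing mid-operation.
        static void putAllSafely(HazelcastInstance hz, String mapName, Map<String, String> batch) {
            if (batch == null || batch.isEmpty()) {
                return;
            }
            IMap<String, String> map = hz.getMap(mapName);
            try {
                map.putAll(batch);
            } catch (RuntimeException e) {
                // A cluster member may be shutting down; log and let the caller retry.
                System.err.println("putAll failed, will retry: " + e);
            }
        }
    }
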

20170410-17:59:51: [INFO]: Number of replicas desired for identifier 42 is 3 [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:51: [INFO]: Potential target node list size for 42 is 2 [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:51: [INFO]: Changed the desired replicas for identifier 42 to the size of the potential target node list: 2 [org.dataone.service.cn.replication.ReplicationManager]
20170410-17:59:51: [INFO]: node initial refresh: new cached time: Apr 10, 2017 5:59:51 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20170410-17:59:51: [INFO]: node initial refresh: new cached time: Apr 10, 2017 5:59:51 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20170410-17:59:51: [INFO]: Added 2 MNReplicationTasks to the queue for 42 [org.dataone.service.cn.replication.ReplicationManager]
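Annotation: the preceding lines show the capping rule at work: 3 replicas were desired for identifier 42, only 2 candidate target nodes passed the version and priority checks, so the desired count is reduced to the candidate list size and 2 MNReplicationTasks are queued. A one-method sketch of that rule; the names are illustrative, not the actual ReplicationManager API:

    import java.util.Arrays;
    import java.util.List;

    class ReplicaCapSketch {
        // The effective replica count can never exceed the number of usable target nodes.
        static int effectiveReplicaCount(int desiredReplicas, List<String> candidateTargets) {
            return Math.min(desiredReplicas, candidateTargets.size());
        }

        public static void main(String[] args) {
            // Mirrors the log: desired 3, candidates {urn:node:testmn5, urn:node:testmn2} -> 2 tasks
            List<String> candidates = Arrays.asList("urn:node:testmn5", "urn:node:testmn2");
            System.out.println(effectiveReplicaCount(3, candidates));
        }
    }
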
Apr 10, 2017 5:59:51 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTTING_DOWN
Apr 10, 2017 5:59:52 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Apr 10, 2017 5:59:52 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: java.io.EOFException[null]
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-1 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5732, Cause:java.io.EOFException
Apr 10, 2017 5:59:52 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Apr 10, 2017 5:59:52 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Apr 10, 2017 5:59:52 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Apr 10, 2017 5:59:52 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:40774] lost. Reason: Explicit close
Apr 10, 2017 5:59:52 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:40775] lost. Reason: Explicit close
Apr 10, 2017 5:59:52 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Apr 10, 2017 5:59:52 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Apr 10, 2017 5:59:52 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:33264
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:33264
Apr 10, 2017 5:59:52 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:33264 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:33265
Apr 10, 2017 5:59:52 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Apr 10, 2017 5:59:52 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:33265
Apr 10, 2017 5:59:52 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:33265 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Apr 10, 2017 5:59:52 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Apr 10, 2017 5:59:52 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Apr 10, 2017 5:59:52 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Apr 10, 2017 5:59:52 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Destroying node initializer.
Apr 10, 2017 5:59:52 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Shutdown is completed in 694 ms.
Apr 10, 2017 5:59:52 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTDOWN
Apr 10, 2017 5:59:52 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTTING_DOWN
Apr 10, 2017 5:59:53 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5732
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: Explicit close
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[0]. PartitionReplicaChangeEvent{partitionId=0, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[2]. PartitionReplicaChangeEvent{partitionId=2, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5732 [DataONEBuildTest] hz.hzProcessInstance2.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[9]. PartitionReplicaChangeEvent{partitionId=9, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[10]. PartitionReplicaChangeEvent{partitionId=10, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[11]. PartitionReplicaChangeEvent{partitionId=11, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[15]. PartitionReplicaChangeEvent{partitionId=15, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[18]. PartitionReplicaChangeEvent{partitionId=18, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[19]. PartitionReplicaChangeEvent{partitionId=19, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[24]. PartitionReplicaChangeEvent{partitionId=24, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[26]. PartitionReplicaChangeEvent{partitionId=26, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[29]. PartitionReplicaChangeEvent{partitionId=29, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[30]. PartitionReplicaChangeEvent{partitionId=30, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[32]. PartitionReplicaChangeEvent{partitionId=32, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[33]. PartitionReplicaChangeEvent{partitionId=33, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[35]. PartitionReplicaChangeEvent{partitionId=35, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[37]. PartitionReplicaChangeEvent{partitionId=37, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[38]. PartitionReplicaChangeEvent{partitionId=38, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[39]. PartitionReplicaChangeEvent{partitionId=39, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[44]. PartitionReplicaChangeEvent{partitionId=44, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[45]. PartitionReplicaChangeEvent{partitionId=45, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[46]. PartitionReplicaChangeEvent{partitionId=46, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[47]. PartitionReplicaChangeEvent{partitionId=47, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[50]. PartitionReplicaChangeEvent{partitionId=50, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[52]. PartitionReplicaChangeEvent{partitionId=52, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[56]. PartitionReplicaChangeEvent{partitionId=56, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[61]. PartitionReplicaChangeEvent{partitionId=61, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[64]. PartitionReplicaChangeEvent{partitionId=64, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[69]. PartitionReplicaChangeEvent{partitionId=69, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[79]. PartitionReplicaChangeEvent{partitionId=79, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[81]. PartitionReplicaChangeEvent{partitionId=81, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[82]. PartitionReplicaChangeEvent{partitionId=82, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[83]. PartitionReplicaChangeEvent{partitionId=83, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[85]. PartitionReplicaChangeEvent{partitionId=85, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[89]. PartitionReplicaChangeEvent{partitionId=89, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[92]. PartitionReplicaChangeEvent{partitionId=92, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[94]. PartitionReplicaChangeEvent{partitionId=94, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[100]. PartitionReplicaChangeEvent{partitionId=100, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[109]. PartitionReplicaChangeEvent{partitionId=109, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[113]. PartitionReplicaChangeEvent{partitionId=113, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[115]. PartitionReplicaChangeEvent{partitionId=115, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[118]. PartitionReplicaChangeEvent{partitionId=118, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[120]. PartitionReplicaChangeEvent{partitionId=120, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[124]. PartitionReplicaChangeEvent{partitionId=124, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[125]. PartitionReplicaChangeEvent{partitionId=125, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[134]. PartitionReplicaChangeEvent{partitionId=134, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[137]. PartitionReplicaChangeEvent{partitionId=137, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[141]. PartitionReplicaChangeEvent{partitionId=141, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[144]. PartitionReplicaChangeEvent{partitionId=144, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[146]. PartitionReplicaChangeEvent{partitionId=146, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[152]. PartitionReplicaChangeEvent{partitionId=152, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[153]. PartitionReplicaChangeEvent{partitionId=153, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[159]. PartitionReplicaChangeEvent{partitionId=159, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[165]. PartitionReplicaChangeEvent{partitionId=165, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[167]. PartitionReplicaChangeEvent{partitionId=167, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[169]. PartitionReplicaChangeEvent{partitionId=169, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[170]. PartitionReplicaChangeEvent{partitionId=170, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[171]. PartitionReplicaChangeEvent{partitionId=171, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[174]. PartitionReplicaChangeEvent{partitionId=174, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[175]. PartitionReplicaChangeEvent{partitionId=175, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[177]. PartitionReplicaChangeEvent{partitionId=177, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[179]. PartitionReplicaChangeEvent{partitionId=179, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[183]. PartitionReplicaChangeEvent{partitionId=183, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[185]. PartitionReplicaChangeEvent{partitionId=185, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[186]. PartitionReplicaChangeEvent{partitionId=186, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[187]. PartitionReplicaChangeEvent{partitionId=187, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[190]. PartitionReplicaChangeEvent{partitionId=190, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[195]. PartitionReplicaChangeEvent{partitionId=195, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[198]. PartitionReplicaChangeEvent{partitionId=198, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[199]. PartitionReplicaChangeEvent{partitionId=199, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[203]. PartitionReplicaChangeEvent{partitionId=203, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[206]. PartitionReplicaChangeEvent{partitionId=206, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[208]. PartitionReplicaChangeEvent{partitionId=208, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[216]. PartitionReplicaChangeEvent{partitionId=216, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[221]. PartitionReplicaChangeEvent{partitionId=221, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[223]. PartitionReplicaChangeEvent{partitionId=223, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[229]. PartitionReplicaChangeEvent{partitionId=229, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[231]. PartitionReplicaChangeEvent{partitionId=231, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[232]. PartitionReplicaChangeEvent{partitionId=232, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[233]. PartitionReplicaChangeEvent{partitionId=233, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[235]. PartitionReplicaChangeEvent{partitionId=235, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[240]. PartitionReplicaChangeEvent{partitionId=240, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[242]. PartitionReplicaChangeEvent{partitionId=242, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[244]. PartitionReplicaChangeEvent{partitionId=244, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[245]. PartitionReplicaChangeEvent{partitionId=245, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[246]. PartitionReplicaChangeEvent{partitionId=246, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[249]. PartitionReplicaChangeEvent{partitionId=249, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[251]. PartitionReplicaChangeEvent{partitionId=251, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[255]. PartitionReplicaChangeEvent{partitionId=255, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[256]. PartitionReplicaChangeEvent{partitionId=256, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[257]. PartitionReplicaChangeEvent{partitionId=257, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[260]. PartitionReplicaChangeEvent{partitionId=260, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[268]. PartitionReplicaChangeEvent{partitionId=268, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Apr 10, 2017 5:59:53 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [1] {
	Member [127.0.0.1]:5731 this
}

Apr 10, 2017 5:59:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:33265] lost. Reason: Explicit close
Apr 10, 2017 5:59:53 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:33264] lost. Reason: Explicit close
Apr 10, 2017 5:59:53 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Apr 10, 2017 5:59:53 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Apr 10, 2017 5:59:53 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:54267
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:54267
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:54267 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:54268
Apr 10, 2017 5:59:53 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Apr 10, 2017 5:59:53 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:54268 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Apr 10, 2017 5:59:53 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:54268
Apr 10, 2017 5:59:53 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Apr 10, 2017 5:59:53 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Apr 10, 2017 5:59:53 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Apr 10, 2017 5:59:54 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Destroying node initializer.
Apr 10, 2017 5:59:54 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Shutdown is completed in 1614 ms.
Apr 10, 2017 5:59:54 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTDOWN
Apr 10, 2017 5:59:54 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTTING_DOWN
Apr 10, 2017 5:59:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:54268] lost. Reason: Explicit close
Apr 10, 2017 5:59:54 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Apr 10, 2017 5:59:54 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Apr 10, 2017 5:59:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Apr 10, 2017 5:59:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Apr 10, 2017 5:59:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:54267] lost. Reason: Explicit close
Apr 10, 2017 5:59:54 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,999 ms later, attempt 1 of 1.
Apr 10, 2017 5:59:54 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,999 ms later, attempt 1 of 1.
Apr 10, 2017 5:59:55 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Destroying node initializer.
Apr 10, 2017 5:59:55 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Shutdown is completed in 1366 ms.
Apr 10, 2017 5:59:55 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTDOWN
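The "Possible data loss" warnings earlier in this shutdown sequence are Hazelcast reporting that the partition owner (the 5732 member) left the cluster with no backup replica to promote. A minimal sketch, against the generic Hazelcast API rather than this test harness's actual cluster configuration (map name and values here are hypothetical), of giving a map one synchronous backup so a departing owner can be replaced from its backup instead of being reported as lost:

    import com.hazelcast.config.Config;
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;

    public class BackupConfigSketch {
        public static void main(String[] args) {
            Config config = new Config();
            // Keep one backup copy of every entry in the (hypothetical) "default" map,
            // so losing the partition owner does not lose the partition's data.
            config.getMapConfig("default").setBackupCount(1);

            HazelcastInstance member = Hazelcast.newHazelcastInstance(config);
            member.getMap("default").put("key", "value");

            // Shut every local instance down cleanly when finished.
            Hazelcast.shutdownAll();
        }
    }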
20170410-17:59:55: [INFO]: Unbind of an LDAP service (9389) is complete. [org.apache.directory.server.ldap.LdapServer]
20170410-17:59:55: [INFO]: Sending notice of disconnect to existing clients sessions. [org.apache.directory.server.ldap.LdapServer]
20170410-17:59:55: [WARN]: javax.naming.CommunicationException: localhost:9389 connection closed [org.dataone.cn.ldap.DirContextUnsolicitedNotificationListener]
20170410-17:59:55: [INFO]: Ldap service stopped. [org.apache.directory.server.ldap.LdapServer]
20170410-17:59:55: [INFO]: clearing all the caches [org.apache.directory.server.core.api.CacheService]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.947 sec
Running org.dataone.service.cn.replication.v2.TestReplicationPrioritization
Node: node1 request factor: 1.0
Node: node4 request factor: 1.0
Node: node2 request factor: 0.0
Node: node3 request factor: 0.8333333
20170410-17:59:56: [INFO]: Retrieving performance metrics for the potential replication list for testPid1 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:56: [INFO]: Node node3 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:56: [INFO]: Priority score for node1 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:56: [INFO]: Priority score for node2 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:56: [INFO]: Priority score for node3 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:56: [INFO]: Priority score for node4 is 2.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:56: [INFO]: Removed node3, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170410-17:59:56: [INFO]: Removed node2, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node4
Node: node1
20170410-17:59:56: [INFO]: Node node1 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node1 request factor: 0.0
Node: node4 request factor: 1.0
Node: node2 request factor: 1.0
Node: node3 request factor: 1.0
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.043 sec
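The prioritization output above reflects per-node factors: a node over its request limit gets a request factor of 0.0, the factors combine into a priority score, and nodes whose score falls to 0.0 are removed from the candidate list. A rough sketch of that kind of scoring, with hypothetical names and a simplified formula rather than the actual ReplicationPrioritizationStrategy logic:

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class PriorityScoreSketch {
        // Hypothetical request factor: 0.0 once a node is over its request limit,
        // otherwise scaled by how much headroom the node has left.
        static double requestFactor(int pendingRequests, int requestLimit) {
            if (pendingRequests > requestLimit) {
                return 0.0;
            }
            return 1.0 - (double) pendingRequests / requestLimit;
        }

        public static void main(String[] args) {
            int requestLimit = 10;
            Map<String, Integer> pendingByNode = new LinkedHashMap<>();
            pendingByNode.put("node1", 0);   // idle node
            pendingByNode.put("node3", 12);  // over the limit of 10, like node3 in the log

            for (Map.Entry<String, Integer> e : pendingByNode.entrySet()) {
                // Other factors (failure history, preference, bandwidth) would combine in here.
                double score = requestFactor(e.getValue(), requestLimit);
                if (score == 0.0) {
                    System.out.println("Removed " + e.getKey() + ", score is " + score);
                } else {
                    System.out.println("Priority score for " + e.getKey() + " is " + score);
                }
            }
        }
    }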
20170410-17:59:56: [INFO]: Closing org.springframework.context.support.GenericApplicationContext@572153d0: startup date [Mon Apr 10 17:59:27 UTC 2017]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20170410-17:59:56: [INFO]: Destroying singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@6d187d70: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]

Results :

Tests run: 15, Failures: 0, Errors: 0, Skipped: 0

[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ d1_replication ---
[INFO] Building jar: /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/d1_replication-2.4.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.3:install (default-install) @ d1_replication ---
[INFO] Installing /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/d1_replication-2.4.0-SNAPSHOT.jar to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-SNAPSHOT.jar
[INFO] Installing /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/pom.xml to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-SNAPSHOT.pom
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.4:create (default) @ d1_replication ---
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_replication/workspace/d1_replication' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication
[INFO] Storing buildNumber: 18760 at timestamp: 1491847199322
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_replication/workspace/d1_replication' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication
[INFO] Storing buildScmBranch: trunk
[WARNING] Failed to getClass for org.apache.maven.plugin.javadoc.JavadocReport
[INFO] 
[INFO] --- maven-javadoc-plugin:2.10.4:javadoc (default-cli) @ d1_replication ---
[INFO] 
Loading source files for package org.dataone.cn.data.repository...
Loading source files for package org.dataone.service.cn.replication...
Loading source files for package org.dataone.service.cn.replication.v2...
Loading source files for package org.dataone.service.cn.replication.v1...
Constructing Javadoc information...
Standard Doclet version 1.7.0_121
Building tree for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/overview-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/constant-values.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/serialized-form.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-use.html...
Building index for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/overview-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/index-all.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/deprecated-list.html...
Building index for all classes...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/allclasses-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/allclasses-noframe.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/index.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/overview-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/help-doc.html...
16 warnings
[WARNING] Javadoc Warnings
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:258: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:280: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:303: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:499: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:542: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:572: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:622: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:406: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:805: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:146: warning - @param argument "repAttemptHistoryRepos" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:140: warning - @param argument "serialVersion" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:331: warning - @param argument "session" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:82: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:99: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:117: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:206: warning - @return tag has no arguments.
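The sixteen Javadoc warnings above come from doc tags that are empty or out of sync with the method signatures: "@return tag has no arguments" means the @return line has no description, and "@param argument ... is not a parameter name" means the tag names a parameter that does not exist on the method. A hypothetical example (not the real ReplicationManager or ReplicationService source) of the shape that avoids both warnings:

    public class JavadocExampleSketch {
        /**
         * Queues replication tasks for later processing.
         *
         * @param taskCount the number of tasks to queue (tag name matches the parameter)
         * @return the number of tasks actually queued (the @return tag carries a description)
         */
        public int queueTasks(int taskCount) {
            return taskCount;
        }
    }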
[JENKINS] Archiving javadoc
Notifying upstream projects of job completion
Join notifier requires a CauseAction
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:08.018s
[INFO] Finished at: Mon Apr 10 18:00:05 UTC 2017
[INFO] Final Memory: 56M/535M
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data
[JENKINS] Archiving /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/pom.xml to org.dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-SNAPSHOT.pom
[JENKINS] Archiving /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/d1_replication-2.4.0-SNAPSHOT.jar to org.dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-SNAPSHOT.jar
channel stopped
Maven RedeployPublisher uses remote Maven settings from: /usr/share/maven/conf/settings.xml
[ERROR] uniqueVersion == false is not anymore supported in maven 3
[INFO] Deployment in file:///var/www/maven (id=,uniqueVersion=false)
Deploying the main artifact d1_replication-2.4.0-SNAPSHOT.jar
Downloading: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/maven-metadata.xml
Downloaded: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/maven-metadata.xml (775 B at 24.4 KB/sec)
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-20170410.180006-2.jar
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-20170410.180006-2.jar (73 KB at 36278.8 KB/sec)
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-20170410.180006-2.pom
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/d1_replication-2.4.0-20170410.180006-2.pom (8 KB at 7042.0 KB/sec)
Downloading: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml
Downloaded: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml (2 KB at 709.5 KB/sec)
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/maven-metadata.xml
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.4.0-SNAPSHOT/maven-metadata.xml (775 B)
Uploading: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml
Uploaded: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml (2 KB at 1418.9 KB/sec)
[INFO] Deployment done in 0.26 sec
IRC notifier plugin: Sending notification to: #dataone-build
IRC notifier plugin: [ERROR] not connected. Cannot send message to '#dataone-build'
Notifying upstream projects of job completion
Notifying upstream of completion: Build_Dev_Level_4 #21
Project Build_Dev_Level_4 still waiting for [d1_cn_index_processor] builds to complete
Warning: you have no plugins providing access control for builds, so falling back to legacy behavior of permitting any downstream builds to be triggered
Not triggering d1_process_daemon because it has a dependency d1_cn_index_processor already building or in queue
Not triggering d1_replication_auditor because it has a dependency d1_cn_index_processor already building or in queue
Notifying upstream build Build_Dev_Level_4 #21 of job completion
Finished: SUCCESS