Console Output (build status: Unstable)

[Skipping 226 KB of earlier console output]
...apped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeasesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPoolDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpErrorLog [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubclassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassData [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPrimaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSecondaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSharedNetworkDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeaseDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:03: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:03: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:03: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:03: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:04: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:04: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:04: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20170216-20:23:04: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20170216-20:23:04: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20170216-20:23:04: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20170216-20:23:04: [INFO]: fetching the cache named system [org.apache.directory.server.core.api.CacheService]
20170216-20:23:04: [INFO]: No cache with name system exists, creating one [org.apache.directory.server.core.api.CacheService]
20170216-20:23:04: [INFO]: Keys and self signed certificate successfully generated. [org.apache.directory.server.core.security.TlsKeyGenerator]
20170216-20:23:05: [INFO]: fetching the cache named groupCache [org.apache.directory.server.core.api.CacheService]
20170216-20:23:05: [INFO]: Initializing ... [org.apache.directory.server.core.event.EventInterceptor]
20170216-20:23:05: [INFO]: Initialization complete. [org.apache.directory.server.core.event.EventInterceptor]
20170216-20:23:05: [WARN]: You didn't change the admin password of directory service instance 'org'.  Please update the admin password as soon as possible to prevent a possible security breach. [org.apache.directory.server.core.DefaultDirectoryService]
20170216-20:23:05: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:05: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:05: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:05: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:05: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:05: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:23:05: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20170216-20:23:05: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20170216-20:23:05: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20170216-20:23:05: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20170216-20:23:05: [INFO]: fetching the cache named org [org.apache.directory.server.core.api.CacheService]
20170216-20:23:05: [INFO]: No cache with name org exists, creating one [org.apache.directory.server.core.api.CacheService]
20170216-20:23:05: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:05: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:05: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:23:06: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:07: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:23:10: [INFO]: Successful bind of an LDAP Service (9389) is completed. [org.apache.directory.server.ldap.LdapServer]
20170216-20:23:10: [INFO]: Ldap service started. [org.apache.directory.server.ldap.LdapServer]
20170216-20:23:10: [INFO]: Loading XML bean definitions from class path resource [org/dataone/configuration/testApplicationContext.xml] [org.springframework.beans.factory.xml.XmlBeanDefinitionReader]
20170216-20:23:10: [INFO]: Refreshing org.springframework.context.support.GenericApplicationContext@5ed4e588: startup date [Thu Feb 16 20:23:10 UTC 2017]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20170216-20:23:10: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@5adf0b65: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
Feb 16, 2017 8:23:10 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Hazelcast Group Config:
GroupConfig [name=DataONEBuildTest, password=*******************]
Hazelcast Maps: hzSystemMetadata hzReplicationTasksMap hzNodes 
Hazelcast Queues: hzReplicationTasks 
Feb 16, 2017 8:23:10 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:23:10 PM com.hazelcast.impl.AddressPicker
WARNING: Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Feb 16, 2017 8:23:10 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5730, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5730], bind any local is true
Feb 16, 2017 8:23:11 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5730
Feb 16, 2017 8:23:11 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:23:11 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTING
Feb 16, 2017 8:23:11 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Feb 16, 2017 8:23:11 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Feb 16, 2017 8:23:11 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:11 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 


Members [1] {
	Member [127.0.0.1]:5730 this
}

Feb 16, 2017 8:23:12 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTED
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5731, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5731], bind any local is true
Feb 16, 2017 8:23:12 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5731
Feb 16, 2017 8:23:12 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTING
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Feb 16, 2017 8:23:12 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:33554
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 33554 accepted socket connection from /127.0.0.1:5730
Feb 16, 2017 8:23:12 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:33554
Feb 16, 2017 8:23:13 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:13 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:23:13 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:18 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:18 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
}

Feb 16, 2017 8:23:18 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
}

Feb 16, 2017 8:23:19 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTED
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5732, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5732], bind any local is true
Feb 16, 2017 8:23:20 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5732
Feb 16, 2017 8:23:20 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTING
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Feb 16, 2017 8:23:20 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 50155 accepted socket connection from /127.0.0.1:5730
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:50155
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:57834
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:57834
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:50155
Feb 16, 2017 8:23:20 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 57834 accepted socket connection from /127.0.0.1:5731
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5731
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:21 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732
}

Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Feb 16, 2017 8:23:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Feb 16, 2017 8:23:27 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:23:28 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTED
Hazelcast member hzMember name: hzProcessInstance
Hazelcast member h1 name: hzProcessInstance1
Hazelcast member h2 name: hzProcessInstance2
Cluster size 3
hzProcessInstance's InetSocketAddress: /127.0.0.1:5730
hzProcessInstance's InetSocketAddress: /127.0.0.1:5731
hzProcessInstance's InetSocketAddress: /127.0.0.1:5732
Feb 16, 2017 8:23:28 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcastTestClientConf.xml'.
20170216-20:23:28: [INFO]: group DataONEBuildTest addresses 127.0.0.1:5730 [org.dataone.cn.hazelcast.HazelcastClientFactory]
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:47371
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:47371
Feb 16, 2017 8:23:28 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:47371 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:47372
Feb 16, 2017 8:23:28 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:47372
Feb 16, 2017 8:23:28 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:47372 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:28 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
20170216-20:23:28: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170216-20:23:28: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170216-20:23:28: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170216-20:23:28: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:28: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:29: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: loading into client truststore: java.io.InputStreamReader@3b642de4 [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: 0 alias CN=DataONE Root CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: 1 alias CN=DataONE Production CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: 2 alias CN=CILogon Basic CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: 3 alias CN=CILogon OpenID CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: 4 alias CN=CILogon Silver CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: 5 alias CN=RapidSSL CA,O=GeoTrust\, Inc.,C=US [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170216-20:23:30: [WARN]: Starting monitor thread [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:23:30: [WARN]: Starting monitoring... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:23:30: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:23:31: [INFO]: RestClient.doRequestNoBody, thread(1) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v2/node [org.dataone.client.rest.RestClient]
20170216-20:23:31: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
20170216-20:23:31: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@712344d8: startup date [Thu Feb 16 20:23:31 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Creating embedded database 'testdb' [org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory]
20170216-20:23:31: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20170216-20:23:31: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:23:31: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20170216-20:23:31: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20170216-20:23:31: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20170216-20:23:31: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20170216-20:23:31: [INFO]: Using dialect: org.hibernate.dialect.H2Dialect [org.hibernate.dialect.Dialect]
20170216-20:23:31: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20170216-20:23:31: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20170216-20:23:31: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20170216-20:23:31: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20170216-20:23:31: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20170216-20:23:31: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@45a98cca [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [wrapper_characters_clob] overrides previous : org.hibernate.type.CharacterArrayClobType@7e5a4580 [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [wrapper_materialized_blob] overrides previous : org.hibernate.type.WrappedMaterializedBlobType@5889174e [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [clob] overrides previous : org.hibernate.type.ClobType@10592f4b [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [java.sql.Clob] overrides previous : org.hibernate.type.ClobType@10592f4b [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [blob] overrides previous : org.hibernate.type.BlobType@4f2fed4f [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [java.sql.Blob] overrides previous : org.hibernate.type.BlobType@4f2fed4f [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [materialized_clob] overrides previous : org.hibernate.type.MaterializedClobType@53850626 [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Type registration [characters_clob] overrides previous : org.hibernate.type.PrimitiveCharacterArrayClobType@4256d3a0 [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170216-20:23:31: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: updating schema [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TASK_QUEUE [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: columns: [id, nextexecution, status, pid, trycount] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: indexes: [index_pid_task, index_exec_task, primary_key_1] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TRY_HISTORY [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: columns: [id, lastreplicationattemptdate, pid, replicationattempts, nodeid] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: indexes: [index_pid, primary_key_b] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:23:31: [INFO]: schema update complete [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@1b3e8303: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationH2RepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationPostgresRepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationAttemptHistoryRepository,replicationTaskRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@5f1a797e: startup date [Thu Feb 16 20:23:31 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
Feb 16, 2017 8:23:31 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Initializing cluster partition table first arrangement...
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20170216-20:23:31: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:23:31: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20170216-20:23:31: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:23:31: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20170216-20:23:31: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20170216-20:23:31: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20170216-20:23:31: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20170216-20:23:31: [INFO]: Using dialect: org.hibernate.dialect.PostgreSQLDialect [org.hibernate.dialect.Dialect]
20170216-20:23:31: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20170216-20:23:31: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20170216-20:23:31: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20170216-20:23:31: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20170216-20:23:31: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:23:31: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20170216-20:23:31: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@45a98cca [org.hibernate.type.BasicTypeRegistry]
20170216-20:23:31: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170216-20:23:31: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:23:31: [ERROR]: could not get database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
	at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
	at org.h2.message.DbException.get(DbException.java:169)
	at org.h2.message.DbException.get(DbException.java:146)
	at org.h2.command.Parser.readTableOrView(Parser.java:4758)
	at org.h2.command.Parser.readTableFilter(Parser.java:1080)
	at org.h2.command.Parser.parseSelectSimpleFromPart(Parser.java:1686)
	at org.h2.command.Parser.parseSelectSimple(Parser.java:1793)
	at org.h2.command.Parser.parseSelectSub(Parser.java:1680)
	at org.h2.command.Parser.parseSelectUnion(Parser.java:1523)
	at org.h2.command.Parser.parseSelect(Parser.java:1511)
	at org.h2.command.Parser.parsePrepared(Parser.java:405)
	at org.h2.command.Parser.parse(Parser.java:279)
	at org.h2.command.Parser.parse(Parser.java:251)
	at org.h2.command.Parser.prepareCommand(Parser.java:217)
	at org.h2.engine.Session.prepareLocal(Session.java:415)
	at org.h2.engine.Session.prepareCommand(Session.java:364)
	at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1121)
	at org.h2.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:70)
	at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.initSequences(DatabaseMetadata.java:151)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:69)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:62)
	at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:170)
	at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:375)
	at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
	at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
	at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
	at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:287)
	at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
	at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
	at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1105)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:915)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:472)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
	at org.dataone.cn.model.repository.D1BaseJpaRepositoryConfiguration.initContext(D1BaseJpaRepositoryConfiguration.java:39)
	at org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory.getReplicationTaskRepository(ReplicationPostgresRepositoryFactory.java:68)
	at org.dataone.service.cn.replication.ReplicationFactory.getReplicationTaskRepository(ReplicationFactory.java:74)
	at org.dataone.service.cn.replication.ReplicationTaskProcessor.<clinit>(ReplicationTaskProcessor.java:20)
	at org.dataone.service.cn.replication.ReplicationManager.startReplicationTaskProcessing(ReplicationManager.java:1028)
	at org.dataone.service.cn.replication.ReplicationManager.<init>(ReplicationManager.java:183)
	at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:175)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
	at org.junit.runners.Suite.runChild(Suite.java:128)
	at org.junit.runners.Suite.runChild(Suite.java:24)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
	at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
	at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
	at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
20170216-20:23:31: [ERROR]: could not complete schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
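(The stack trace for this second SchemaUpdate error is identical to the one above.) The failure traces back to a dialect/datasource mismatch visible earlier in this excerpt: the context overrides the H2 factory's 'jpaVendorAdapter' bean with the one from replicationPostgresRepositoryFactory, so Hibernate applies PostgreSQLDialect to the in-memory H2 database, its sequence lookup (select relname from pg_class) fails, the schema update aborts, and replication_task_queue is never created. A minimal sketch of the kind of H2-oriented vendor adapter the test context would need is shown below; it assumes standard Spring ORM and Hibernate APIs, and the class and bean names are illustrative, not taken from the DataONE sources.

    // Illustrative sketch only: a vendor adapter bound to H2 so hbm2ddl issues
    // H2-compatible metadata queries instead of the pg_class lookup above.
    // Class/bean names are assumptions, not the project's actual configuration.
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

    @Configuration
    public class H2VendorAdapterSketch {
        @Bean
        public HibernateJpaVendorAdapter jpaVendorAdapter() {
            HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
            // H2Dialect lets SchemaUpdate read sequence metadata from H2's
            // information_schema, so table creation can proceed.
            adapter.setDatabasePlatform("org.hibernate.dialect.H2Dialect");
            adapter.setGenerateDdl(true);
            return adapter;
        }
    }

With the H2 dialect in effect, hbm2ddl should be able to create the mapped tables, which would also account for the later 'Table "REPLICATION_TASK_QUEUE" not found' warnings (SQL error 42102) in this log.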
20170216-20:23:31: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@38517f22: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationPostgresRepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationH2RepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationAttemptHistoryRepository,replicationTaskRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:23:31: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20170216-20:23:31: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ [42102-163] [org.hibernate.util.JDBCExceptionReporter]
20170216-20:23:31: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:23:31: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:31: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:31: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:31: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:23:31: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:31: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:31: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:31: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20170216-20:23:31: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ where replicatio0_.status=? and replicatio0_.nextExecution<? order by replicatio0_.nextExecution asc limit ? [42102-163] [org.hibernate.util.JDBCExceptionReporter]
20170216-20:23:33: [INFO]: testCreateAndQueueTask replicationManager.createAndQueueTask [org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit]
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=16, migrating=null
partition=Partition [187]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=16, migrating=null
partition=Partition [269]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=16, migrating=null
partition=Partition [187]{
}
}
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=17, migrating=null
partition=Partition [187]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=16, migrating=null
partition=Partition [269]{
}
}
Feb 16, 2017 8:23:39 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=17, migrating=null
partition=Partition [269]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=17, migrating=null
partition=Partition [187]{
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=18, migrating=null
partition=Partition [187]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=17, migrating=null
partition=Partition [269]{
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=18, migrating=null
partition=Partition [269]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=18, migrating=null
partition=Partition [187]{
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:__hz_Locks, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_LOCK, target=Address[127.0.0.1]:5731 / connected=true, redoCount=19, migrating=null
partition=Partition [187]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Handler -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, caller=Address[127.0.0.1]:5730 / connected=true, redoCount=18, migrating=null
partition=Partition [269]{
}
}
Feb 16, 2017 8:23:40 PM com.hazelcast.impl.ConcurrentMapManager
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] Caller -> RedoLog{name=c:hzSystemMetadata, redoType=REDO_TARGET_WRONG, operation=CONCURRENT_MAP_PUT, target=Address[127.0.0.1]:5731 / connected=true, redoCount=19, migrating=null
partition=Partition [269]{
	0:Address[127.0.0.1]:5731
	1:Address[127.0.0.1]:5732
	2:Address[127.0.0.1]:5730
}
}
20170216-20:23:41: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170216-20:23:41: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:23:41: [WARN]: In Replication Manager, task that should exist 'in process' does not exist.  Creating new task for pid: 42 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:23:41: [ERROR]: Unhandled Exception for pid: 42. Error is : Could not retreive sysmeta from map for pid 42 [org.dataone.service.cn.replication.ReplicationManager]
org.dataone.service.exceptions.NotFound: Could not retreive sysmeta from map for pid 42
	at org.dataone.service.cn.replication.ReplicationManager.processPid(ReplicationManager.java:409)
	at org.dataone.service.cn.replication.ReplicationManager.createAndQueueTasks(ReplicationManager.java:366)
	at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:186)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
	at org.junit.runners.Suite.runChild(Suite.java:128)
	at org.junit.runners.Suite.runChild(Suite.java:24)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
	at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
	at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
	at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
20170216-20:23:41: [INFO]: Added 0 MNReplicationTasks to the queue for 42 [org.dataone.service.cn.replication.ReplicationManager]
Feb 16, 2017 8:23:41 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTTING_DOWN
20170216-20:23:41: [INFO]: RestClient.doRequestNoBody, thread(90) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v1/node [org.dataone.client.rest.RestClient]
20170216-20:23:41: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
Feb 16, 2017 8:23:41 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:23:41 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5732, Cause:java.io.EOFException
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Feb 16, 2017 8:23:41 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-1 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Feb 16, 2017 8:23:41 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Feb 16, 2017 8:23:41 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:47372] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:47371] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:49987
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:49988
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:49987
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:49988
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49988 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49987 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:42 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Shutdown is completed in 876 ms.
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTDOWN
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTTING_DOWN
Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5732
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5732 [DataONEBuildTest] hz.hzProcessInstance2.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[17]. PartitionReplicaChangeEvent{partitionId=17, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[31]. PartitionReplicaChangeEvent{partitionId=31, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[33]. PartitionReplicaChangeEvent{partitionId=33, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[36]. PartitionReplicaChangeEvent{partitionId=36, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[38]. PartitionReplicaChangeEvent{partitionId=38, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[39]. PartitionReplicaChangeEvent{partitionId=39, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[42]. PartitionReplicaChangeEvent{partitionId=42, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[44]. PartitionReplicaChangeEvent{partitionId=44, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[51]. PartitionReplicaChangeEvent{partitionId=51, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[53]. PartitionReplicaChangeEvent{partitionId=53, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[59]. PartitionReplicaChangeEvent{partitionId=59, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[61]. PartitionReplicaChangeEvent{partitionId=61, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[65]. PartitionReplicaChangeEvent{partitionId=65, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[66]. PartitionReplicaChangeEvent{partitionId=66, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[67]. PartitionReplicaChangeEvent{partitionId=67, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[69]. PartitionReplicaChangeEvent{partitionId=69, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[70]. PartitionReplicaChangeEvent{partitionId=70, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[74]. PartitionReplicaChangeEvent{partitionId=74, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[75]. PartitionReplicaChangeEvent{partitionId=75, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[76]. PartitionReplicaChangeEvent{partitionId=76, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[77]. PartitionReplicaChangeEvent{partitionId=77, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[79]. PartitionReplicaChangeEvent{partitionId=79, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[82]. PartitionReplicaChangeEvent{partitionId=82, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[87]. PartitionReplicaChangeEvent{partitionId=87, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[88]. PartitionReplicaChangeEvent{partitionId=88, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[92]. PartitionReplicaChangeEvent{partitionId=92, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[97]. PartitionReplicaChangeEvent{partitionId=97, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[98]. PartitionReplicaChangeEvent{partitionId=98, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[102]. PartitionReplicaChangeEvent{partitionId=102, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[105]. PartitionReplicaChangeEvent{partitionId=105, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[106]. PartitionReplicaChangeEvent{partitionId=106, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[110]. PartitionReplicaChangeEvent{partitionId=110, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[113]. PartitionReplicaChangeEvent{partitionId=113, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[114]. PartitionReplicaChangeEvent{partitionId=114, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[117]. PartitionReplicaChangeEvent{partitionId=117, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[119]. PartitionReplicaChangeEvent{partitionId=119, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[122]. PartitionReplicaChangeEvent{partitionId=122, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[123]. PartitionReplicaChangeEvent{partitionId=123, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[131]. PartitionReplicaChangeEvent{partitionId=131, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[133]. PartitionReplicaChangeEvent{partitionId=133, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[134]. PartitionReplicaChangeEvent{partitionId=134, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[136]. PartitionReplicaChangeEvent{partitionId=136, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[137]. PartitionReplicaChangeEvent{partitionId=137, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[140]. PartitionReplicaChangeEvent{partitionId=140, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[142]. PartitionReplicaChangeEvent{partitionId=142, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[143]. PartitionReplicaChangeEvent{partitionId=143, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[144]. PartitionReplicaChangeEvent{partitionId=144, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[151]. PartitionReplicaChangeEvent{partitionId=151, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[154]. PartitionReplicaChangeEvent{partitionId=154, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[155]. PartitionReplicaChangeEvent{partitionId=155, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[156]. PartitionReplicaChangeEvent{partitionId=156, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[157]. PartitionReplicaChangeEvent{partitionId=157, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[158]. PartitionReplicaChangeEvent{partitionId=158, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[162]. PartitionReplicaChangeEvent{partitionId=162, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[163]. PartitionReplicaChangeEvent{partitionId=163, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[165]. PartitionReplicaChangeEvent{partitionId=165, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[170]. PartitionReplicaChangeEvent{partitionId=170, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[173]. PartitionReplicaChangeEvent{partitionId=173, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[174]. PartitionReplicaChangeEvent{partitionId=174, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[175]. PartitionReplicaChangeEvent{partitionId=175, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[176]. PartitionReplicaChangeEvent{partitionId=176, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[178]. PartitionReplicaChangeEvent{partitionId=178, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[182]. PartitionReplicaChangeEvent{partitionId=182, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[190]. PartitionReplicaChangeEvent{partitionId=190, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[196]. PartitionReplicaChangeEvent{partitionId=196, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[197]. PartitionReplicaChangeEvent{partitionId=197, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[204]. PartitionReplicaChangeEvent{partitionId=204, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[205]. PartitionReplicaChangeEvent{partitionId=205, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[206]. PartitionReplicaChangeEvent{partitionId=206, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[207]. PartitionReplicaChangeEvent{partitionId=207, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[213]. PartitionReplicaChangeEvent{partitionId=213, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[214]. PartitionReplicaChangeEvent{partitionId=214, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[221]. PartitionReplicaChangeEvent{partitionId=221, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[222]. PartitionReplicaChangeEvent{partitionId=222, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[225]. PartitionReplicaChangeEvent{partitionId=225, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[228]. PartitionReplicaChangeEvent{partitionId=228, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[238]. PartitionReplicaChangeEvent{partitionId=238, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[247]. PartitionReplicaChangeEvent{partitionId=247, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[248]. PartitionReplicaChangeEvent{partitionId=248, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[249]. PartitionReplicaChangeEvent{partitionId=249, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[253]. PartitionReplicaChangeEvent{partitionId=253, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[256]. PartitionReplicaChangeEvent{partitionId=256, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[257]. PartitionReplicaChangeEvent{partitionId=257, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[258]. PartitionReplicaChangeEvent{partitionId=258, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[263]. PartitionReplicaChangeEvent{partitionId=263, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[264]. PartitionReplicaChangeEvent{partitionId=264, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[266]. PartitionReplicaChangeEvent{partitionId=266, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[270]. PartitionReplicaChangeEvent{partitionId=270, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Feb 16, 2017 8:23:42 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [1] {
	Member [127.0.0.1]:5731 this
}
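
The run of PartitionManager warnings above records the 5732 member shutting down: every partition it owned is left with newAddress=null, and the cluster view that follows shows only the 5731 member remaining. For reference, a minimal sketch of how a cluster participant could be notified of such a departure, assuming the Hazelcast 2.x-era API that appears in this log (com.hazelcast.impl.*); the class name and messages are hypothetical and not part of this build:

import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.core.MembershipEvent;
import com.hazelcast.core.MembershipListener;

/** Hypothetical listener that logs cluster membership changes such as the 5732 member leaving. */
public class MemberDepartureLogger implements MembershipListener {
    public void memberAdded(MembershipEvent event) {
        System.out.println("Member joined: " + event.getMember());
    }
    public void memberRemoved(MembershipEvent event) {
        // Partitions owned by a departed member with no surviving backup are the ones
        // the "Possible data loss" warnings above refer to.
        System.out.println("Member left: " + event.getMember());
    }
    public static void main(String[] args) {
        HazelcastInstance hz = Hazelcast.newHazelcastInstance(new Config());
        hz.getCluster().addMembershipListener(new MemberDepartureLogger());
    }
}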

Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:49988] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:49987] lost. Reason: Explicit close
Feb 16, 2017 8:23:42 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49090
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49091
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49090
Feb 16, 2017 8:23:42 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49091
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49090 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49091 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:23:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:43 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:23:43 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:23:43 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Shutdown is completed in 1612 ms.
Feb 16, 2017 8:23:43 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTDOWN
Feb 16, 2017 8:23:43 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTTING_DOWN
Feb 16, 2017 8:23:44 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49091] lost. Reason: Explicit close
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Feb 16, 2017 8:23:44 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:44 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49090] lost. Reason: Explicit close
Feb 16, 2017 8:23:44 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,999 ms later, attempt 1 of 1.
Feb 16, 2017 8:23:44 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,998 ms later, attempt 1 of 1.
Feb 16, 2017 8:23:45 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:23:45 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Shutdown is completed in 1072 ms.
Feb 16, 2017 8:23:45 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTDOWN
20170216-20:23:45: [INFO]: Unbind of an LDAP service (9389) is complete. [org.apache.directory.server.ldap.LdapServer]
20170216-20:23:45: [INFO]: Sending notice of disconnect to existing clients sessions. [org.apache.directory.server.ldap.LdapServer]
20170216-20:23:45: [INFO]: Ldap service stopped. [org.apache.directory.server.ldap.LdapServer]
20170216-20:23:45: [INFO]: clearing all the caches [org.apache.directory.server.core.api.CacheService]
Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 45.667 sec <<< FAILURE!
Running org.dataone.service.cn.replication.v2.TestReplicationPrioritization
Node: node1 request factor: 1.0
Node: node4 request factor: 1.0
Node: node2 request factor: 0.0
Node: node3 request factor: 0.8333333
20170216-20:23:45: [INFO]: Retrieving performance metrics for the potential replication list for testPid1 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:23:45: [INFO]: Node node3 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:23:45: [INFO]: Priority score for node1 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:23:45: [INFO]: Priority score for node2 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:23:45: [INFO]: Priority score for node3 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:23:45: [INFO]: Priority score for node4 is 2.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:23:45: [INFO]: Removed node3, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:23:45: [INFO]: Removed node2, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node4
Node: node1
20170216-20:23:45: [INFO]: Node node1 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node1 request factor: 0.0
Node: node4 request factor: 1.0
Node: node2 request factor: 1.0
Node: node3 request factor: 1.0
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.023 sec
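
The TestReplicationPrioritization output above illustrates the per-node request limit: node3 starts with a request factor of 0.8333333, but once it is reported as over its limit of 10 pending requests its priority score drops to 0.0 and it is removed from the candidate list, as is node2. The log does not show the actual scoring formula used by ReplicationPrioritizationStrategy; the following is only a toy illustration of the over-limit filtering step, with all names and counts other than the limit of 10 made up:

import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

public class PrioritizationSketch {
    static final int REQUEST_LIMIT = 10; // the "request limit of 10 requests" seen in the log

    /** A node over its pending-request limit scores 0.0; otherwise its request factor stands in for the score. */
    static double priorityScore(double requestFactor, int pendingRequests) {
        return pendingRequests > REQUEST_LIMIT ? 0.0 : requestFactor;
    }

    public static void main(String[] args) {
        Map<String, Double> scores = new HashMap<String, Double>();
        scores.put("node1", priorityScore(1.0, 3));
        scores.put("node2", priorityScore(0.0, 1));
        scores.put("node3", priorityScore(0.8333333, 12)); // over the limit, so 0.0
        // Drop zero-scored nodes, mirroring "Removed node3 ... Removed node2" above.
        Iterator<Map.Entry<String, Double>> it = scores.entrySet().iterator();
        while (it.hasNext()) {
            if (it.next().getValue() == 0.0) {
                it.remove();
            }
        }
        System.out.println("Remaining candidates: " + scores.keySet());
    }
}
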
20170216-20:23:45: [INFO]: Closing org.springframework.context.support.GenericApplicationContext@5ed4e588: startup date [Thu Feb 16 20:23:10 UTC 2017]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20170216-20:23:45: [INFO]: Destroying singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@5adf0b65: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]

Results :

Failed tests:   testCreateAndQueueTasks(org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit): The number of tasks created should equal the replication policy numberOfReplicas. expected:<2> but was:<0>

Tests run: 15, Failures: 1, Errors: 0, Skipped: 0

[ERROR] There are test failures.

Please refer to /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/surefire-reports for the individual test results.
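
The single failure reported above comes from testCreateAndQueueTasks, where ReplicationManager produced 0 tasks although the replication policy asked for 2. The message format suggests a plain JUnit equality check along these lines; this is a hypothetical reconstruction for readability, not the project's actual test code:

import static org.junit.Assert.assertEquals;

public class ReplicationCountAssertionSketch {
    public static void main(String[] args) {
        int numberOfReplicas = 2; // required by the object's replication policy
        int tasksCreated = 0;     // what the manager actually queued in this run
        // Fails with "... numberOfReplicas. expected:<2> but was:<0>", matching the summary above.
        assertEquals("The number of tasks created should equal the replication policy numberOfReplicas.",
                numberOfReplicas, tasksCreated);
    }
}
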
[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ d1_replication ---
[INFO] Building jar: /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/d1_replication-2.3.1.jar
[INFO] 
[INFO] --- maven-install-plugin:2.3:install (default-install) @ d1_replication ---
[INFO] Installing /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/d1_replication-2.3.1.jar to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.jar
[INFO] Installing /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/pom.xml to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.pom
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.4:create (default) @ d1_replication ---
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication
[INFO] Storing buildNumber: 18629 at timestamp: 1487276628692
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication
[INFO] Storing buildScmBranch: tags/D1_REPLICATION_v2.3.1
[WARNING] Failed to getClass for org.apache.maven.plugin.javadoc.JavadocReport
[INFO] 
[INFO] --- maven-javadoc-plugin:2.10.4:javadoc (default-cli) @ d1_replication ---
[INFO] 
Loading source files for package org.dataone.cn.data.repository...
Loading source files for package org.dataone.service.cn.replication...
Loading source files for package org.dataone.service.cn.replication.v2...
Loading source files for package org.dataone.service.cn.replication.v1...
Constructing Javadoc information...
Standard Doclet version 1.7.0_121
Building tree for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/overview-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/constant-values.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/serialized-form.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-use.html...
Building index for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/overview-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/index-all.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/deprecated-list.html...
Building index for all classes...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/allclasses-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/allclasses-noframe.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/index.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/overview-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/help-doc.html...
16 warnings
[WARNING] Javadoc Warnings
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:258: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:280: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:303: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:499: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:542: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:572: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:622: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:406: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:805: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:146: warning - @param argument "repAttemptHistoryRepos" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:140: warning - @param argument "serialVersion" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:331: warning - @param argument "session" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:82: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:99: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:117: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:206: warning - @return tag has no arguments.
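
All 16 Javadoc warnings above are of two kinds: @return tags written with no description, and @param tags whose argument name does not match a parameter of the documented method. A minimal example of the form the doclet accepts without warnings; the method and names are hypothetical, not taken from the d1_replication sources:

public class JavadocStyleExample {
    /**
     * Counts the replication tasks currently queued for a node.
     *
     * @param nodeId the identifier of the member node being inspected
     * @return the number of queued tasks for that node
     */
    public int countQueuedTasks(String nodeId) {
        return nodeId == null ? 0 : nodeId.length(); // placeholder body, for illustration only
    }
}
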
[JENKINS] Archiving javadoc
Notifying upstream projects of job completion
Join notifier requires a CauseAction
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:15.506s
[INFO] Finished at: Thu Feb 16 20:23:54 UTC 2017
[INFO] Final Memory: 56M/535M
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data