Success
Console Output

Skipping 229 KB..
RR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpImplementation [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHashBucketAssignment [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDelayedServiceParameter [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpMaxClientLeadTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpFailOverEndpointState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRange [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPermitList [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeasesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOption [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpNetMask [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRange [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPoolDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeasesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPoolDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpErrorLog [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubclassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassData [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPrimaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSecondaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSharedNetworkDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeaseDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20170216-20:26:23: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
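
The ERR_04447/ERR_04226 messages above are emitted by the embedded ApacheDS instance as it works through the dhcp* attribute types loaded for this test. For context, a minimal sketch (not the project's code) of how an attribute type's equality matching rule and normalizer can be inspected through a SchemaManager in the Apache Directory LDAP API generation shown in these lines (org.apache.directory.api.*). The standard uniqueMember attribute is used purely as an illustration because its syntax is Name And Optional UID, the normalization named in the warnings; the dhcp* attributes themselves are test-specific and are not in the bundled schemas.

    import org.apache.directory.api.ldap.model.schema.AttributeType;
    import org.apache.directory.api.ldap.model.schema.MatchingRule;
    import org.apache.directory.api.ldap.model.schema.SchemaManager;
    import org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager;

    public class InspectNormalization {
        public static void main(String[] args) throws Exception {
            // Load the schemas bundled with the LDAP API; the dhcp schema used by this
            // build is loaded separately by the test and is not available here.
            SchemaManager schemaManager = new DefaultSchemaManager();
            schemaManager.loadAllEnabled();

            // uniqueMember uses the Name And Optional UID syntax, the normalization
            // named in the ERR_04226 warnings above.
            AttributeType at = schemaManager.lookupAttributeTypeRegistry("uniqueMember");
            MatchingRule equality = at.getEquality();
            System.out.println(at.getName() + " equality: " + equality.getName());
            System.out.println("normalizer: " + equality.getNormalizer().getClass().getName());
        }
    }
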
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:23: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:24: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:24: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:24: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:24: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:24: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:24: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:24: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:24: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20170216-20:26:24: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20170216-20:26:24: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20170216-20:26:24: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20170216-20:26:24: [INFO]: fetching the cache named system [org.apache.directory.server.core.api.CacheService]
20170216-20:26:24: [INFO]: No cache with name system exists, creating one [org.apache.directory.server.core.api.CacheService]
20170216-20:26:24: [INFO]: Keys and self signed certificate successfully generated. [org.apache.directory.server.core.security.TlsKeyGenerator]
20170216-20:26:25: [INFO]: fetching the cache named groupCache [org.apache.directory.server.core.api.CacheService]
20170216-20:26:25: [INFO]: Initializing ... [org.apache.directory.server.core.event.EventInterceptor]
20170216-20:26:25: [INFO]: Initialization complete. [org.apache.directory.server.core.event.EventInterceptor]
20170216-20:26:25: [WARN]: You didn't change the admin password of directory service instance 'org'.  Please update the admin password as soon as possible to prevent a possible security breach. [org.apache.directory.server.core.DefaultDirectoryService]
20170216-20:26:25: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:25: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:25: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:25: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:25: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:25: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20170216-20:26:25: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20170216-20:26:25: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20170216-20:26:25: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20170216-20:26:25: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20170216-20:26:25: [INFO]: fetching the cache named org [org.apache.directory.server.core.api.CacheService]
20170216-20:26:25: [INFO]: No cache with name org exists, creating one [org.apache.directory.server.core.api.CacheService]
20170216-20:26:25: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:25: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:26:25: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20170216-20:26:27: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:27: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20170216-20:26:29: [INFO]: Successful bind of an LDAP Service (9389) is completed. [org.apache.directory.server.ldap.LdapServer]
20170216-20:26:29: [INFO]: Ldap service started. [org.apache.directory.server.ldap.LdapServer]
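
With the embedded server bound on port 9389 (two lines above), a client connection can be opened against it. A minimal sketch, assuming the Apache Directory LDAP client API is on the classpath and that the instance still uses the ApacheDS default admin DN and password; the unchanged-password warning at 20:26:25 suggests the default, but the credentials here are an assumption, not something stated in this log.

    import org.apache.directory.ldap.client.api.LdapConnection;
    import org.apache.directory.ldap.client.api.LdapNetworkConnection;

    public class ConnectToEmbeddedLdap {
        public static void main(String[] args) throws Exception {
            // Port 9389 is the one reported in the "Successful bind of an LDAP Service (9389)" line.
            LdapConnection connection = new LdapNetworkConnection("localhost", 9389);
            try {
                // uid=admin,ou=system / "secret" are the ApacheDS defaults (assumed here).
                connection.bind("uid=admin,ou=system", "secret");
                System.out.println("connected: " + connection.isConnected());
            } finally {
                connection.close();
            }
        }
    }
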
20170216-20:26:29: [INFO]: Loading XML bean definitions from class path resource [org/dataone/configuration/testApplicationContext.xml] [org.springframework.beans.factory.xml.XmlBeanDefinitionReader]
20170216-20:26:29: [INFO]: Refreshing org.springframework.context.support.GenericApplicationContext@50f547c5: startup date [Thu Feb 16 20:26:29 UTC 2017]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20170216-20:26:29: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@6d67b6d0: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
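
The three Spring lines above show a GenericApplicationContext being populated from a classpath XML file by an XmlBeanDefinitionReader and then refreshed. A minimal sketch of that pattern, using the resource path from the log; the beans listed there (mylog, log4jInitialization, readSystemMetadataResource, nodeListResource) are defined in that test XML and are not reproduced here.

    import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
    import org.springframework.context.support.GenericApplicationContext;
    import org.springframework.core.io.ClassPathResource;

    public class LoadTestContext {
        public static void main(String[] args) {
            GenericApplicationContext context = new GenericApplicationContext();
            XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(context);

            // Same classpath resource named in the log line above.
            reader.loadBeanDefinitions(
                    new ClassPathResource("org/dataone/configuration/testApplicationContext.xml"));

            // refresh() triggers the "Pre-instantiating singletons" step seen in the log.
            context.refresh();
            context.close();
        }
    }
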
Feb 16, 2017 8:26:29 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Hazelcast Group Config:
GroupConfig [name=DataONEBuildTest, password=*******************]
Hazelcast Maps: hzSystemMetadata hzReplicationTasksMap hzNodes 
Hazelcast Queues: hzReplicationTasks 
Feb 16, 2017 8:26:30 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:26:30 PM com.hazelcast.impl.AddressPicker
WARNING: Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Feb 16, 2017 8:26:30 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5730, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5730], bind any local is true
Feb 16, 2017 8:26:30 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5730
Feb 16, 2017 8:26:30 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:26:30 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTING
Feb 16, 2017 8:26:30 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Feb 16, 2017 8:26:30 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Feb 16, 2017 8:26:30 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:26:30 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:26:31 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:26:31 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:26:31 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 


Members [1] {
	Member [127.0.0.1]:5730 this
}

Feb 16, 2017 8:26:31 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTED
Feb 16, 2017 8:26:31 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:26:31 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5731, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5731], bind any local is true
Feb 16, 2017 8:26:31 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5731
Feb 16, 2017 8:26:31 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:26:31 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTING
Feb 16, 2017 8:26:31 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Feb 16, 2017 8:26:31 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Feb 16, 2017 8:26:31 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:26:31 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:33679
Feb 16, 2017 8:26:31 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 33679 accepted socket connection from /127.0.0.1:5730
Feb 16, 2017 8:26:31 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:33679
Feb 16, 2017 8:26:32 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:32 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Feb 16, 2017 8:26:32 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:37 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:37 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
}

Feb 16, 2017 8:26:37 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
}

Feb 16, 2017 8:26:38 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:39 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTED
Feb 16, 2017 8:26:39 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Feb 16, 2017 8:26:39 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5732, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5732], bind any local is true
Feb 16, 2017 8:26:39 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5732
Feb 16, 2017 8:26:39 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Feb 16, 2017 8:26:39 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTING
Feb 16, 2017 8:26:39 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Feb 16, 2017 8:26:39 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Feb 16, 2017 8:26:39 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:41724
Feb 16, 2017 8:26:39 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:41724
Feb 16, 2017 8:26:39 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 41724 accepted socket connection from /127.0.0.1:5730
Feb 16, 2017 8:26:39 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:38012
Feb 16, 2017 8:26:39 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 38012 accepted socket connection from /127.0.0.1:5731
Feb 16, 2017 8:26:39 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:38012
Feb 16, 2017 8:26:40 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:40 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5731
Feb 16, 2017 8:26:40 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:40 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:45 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:45 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732
}

Feb 16, 2017 8:26:45 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Feb 16, 2017 8:26:45 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Feb 16, 2017 8:26:46 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Feb 16, 2017 8:26:47 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTED
Hazelcast member hzMember name: hzProcessInstance
Hazelcast member h1 name: hzProcessInstance1
Hazelcast member h2 name: hzProcessInstance2
Cluster size 3
hzProcessInstance's InetSocketAddress: /127.0.0.1:5730
hzProcessInstance's InetSocketAddress: /127.0.0.1:5731
hzProcessInstance's InetSocketAddress: /127.0.0.1:5732
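
The three Hazelcast members above (ports 5730-5732, group DataONEBuildTest) form a cluster over the TCP-IP joiner with interfaces disabled, as the AddressPicker lines show. A minimal sketch of the same pattern against the Hazelcast 2.x API, reading the classpath XML config named in the log; that the harness starts three members in one JVM from one Config is an inference from the auto-incrementing ports, and the member-count check mirrors the "Cluster size 3" line.

    import com.hazelcast.config.ClasspathXmlConfig;
    import com.hazelcast.config.Config;
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;

    public class StartTestCluster {
        public static void main(String[] args) {
            // Same classpath XML the log reports: group DataONEBuildTest, TCP-IP join on 127.0.0.1.
            Config config = new ClasspathXmlConfig("org/dataone/configuration/hazelcast.xml");

            // Three members in one JVM; with port auto-increment they land on 5730-5732.
            HazelcastInstance member1 = Hazelcast.newHazelcastInstance(config);
            HazelcastInstance member2 = Hazelcast.newHazelcastInstance(config);
            HazelcastInstance member3 = Hazelcast.newHazelcastInstance(config);

            System.out.println("Cluster size " + member1.getCluster().getMembers().size());

            Hazelcast.shutdownAll();
        }
    }
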
Feb 16, 2017 8:26:47 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcastTestClientConf.xml'.
20170216-20:26:47: [INFO]: group DataONEBuildTest addresses 127.0.0.1:5730 [org.dataone.cn.hazelcast.HazelcastClientFactory]
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Feb 16, 2017 8:26:47 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:47552
Feb 16, 2017 8:26:47 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:47552
Feb 16, 2017 8:26:47 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:47552 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Feb 16, 2017 8:26:47 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:47553
Feb 16, 2017 8:26:47 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:47553
Feb 16, 2017 8:26:47 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:47553 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:26:47 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
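
The two client connections above are opened against the 5730 member in group DataONEBuildTest (see the HazelcastClientFactory line at 20:26:47). A rough sketch assuming the Hazelcast 2.x client API (ClientConfig.addAddress, HazelcastClient.newHazelcastClient); the real settings live in hazelcastTestClientConf.xml and DataONE's HazelcastClientFactory, and the group password is masked in the log, so the value below is a placeholder.

    import com.hazelcast.client.ClientConfig;
    import com.hazelcast.client.HazelcastClient;
    import com.hazelcast.core.HazelcastInstance;

    public class ConnectTestClient {
        public static void main(String[] args) {
            ClientConfig clientConfig = new ClientConfig();

            // Group name and member address come from the log; the password does not.
            clientConfig.getGroupConfig().setName("DataONEBuildTest");
            clientConfig.getGroupConfig().setPassword("placeholder-password");
            clientConfig.addAddress("127.0.0.1:5730");

            HazelcastInstance client = HazelcastClient.newHazelcastClient(clientConfig);
            System.out.println("client running: " + client.getLifecycleService().isRunning());
            client.getLifecycleService().shutdown();
        }
    }
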
20170216-20:26:48: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: loading into client truststore: java.io.InputStreamReader@4ab6244f [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: 0 alias CN=DataONE Root CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: 1 alias CN=DataONE Production CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: 2 alias CN=CILogon Basic CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: 3 alias CN=CILogon OpenID CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: 4 alias CN=CILogon Silver CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: 5 alias CN=RapidSSL CA,O=GeoTrust\, Inc.,C=US [org.dataone.client.auth.CertificateManager]
20170216-20:26:48: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
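
The CertificateManager lines above report building a TLSv1.2 SSLContext with a custom trust manager loaded from the DataONE truststore, then a socket factory with an allow-all hostname verifier. A minimal sketch of the plain-JSSE part of that; the truststore path and password below are hypothetical, since the log only shows the truststore being read from a classpath stream.

    import java.io.FileInputStream;
    import java.security.KeyStore;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManagerFactory;

    public class BuildTlsContext {
        public static void main(String[] args) throws Exception {
            // Hypothetical truststore location and password.
            KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
            try (FileInputStream in = new FileInputStream("/path/to/d1-trusted-certs.jks")) {
                trustStore.load(in, "changeit".toCharArray());
            }

            TrustManagerFactory tmf =
                    TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
            tmf.init(trustStore);

            // "...setting SSLContext with protocol: TLSv1.2" in the log above.
            SSLContext sslContext = SSLContext.getInstance("TLSv1.2");
            sslContext.init(null, tmf.getTrustManagers(), null);

            System.out.println("protocol: " + sslContext.getProtocol());
        }
    }
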
20170216-20:26:48: [WARN]: Starting monitor thread [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:26:48: [WARN]: Starting monitoring... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:26:48: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:26:49: [INFO]: RestClient.doRequestNoBody, thread(1) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v2/node [org.dataone.client.rest.RestClient]
20170216-20:26:49: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
20170216-20:26:49: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@61a298f0: startup date [Thu Feb 16 20:26:49 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [INFO]: Creating embedded database 'testdb' [org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory]
20170216-20:26:49: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20170216-20:26:49: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20170216-20:26:49: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20170216-20:26:49: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:26:49: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20170216-20:26:49: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:26:49: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20170216-20:26:49: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20170216-20:26:49: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20170216-20:26:49: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20170216-20:26:49: [INFO]: Using dialect: org.hibernate.dialect.H2Dialect [org.hibernate.dialect.Dialect]
20170216-20:26:49: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20170216-20:26:49: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20170216-20:26:49: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20170216-20:26:49: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20170216-20:26:49: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:49: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20170216-20:26:49: [INFO]: Type registration [characters_clob] overrides previous : org.hibernate.type.PrimitiveCharacterArrayClobType@7e5a4580 [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [blob] overrides previous : org.hibernate.type.BlobType@5889174e [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [java.sql.Blob] overrides previous : org.hibernate.type.BlobType@5889174e [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [materialized_clob] overrides previous : org.hibernate.type.MaterializedClobType@10592f4b [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@4f2fed4f [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [wrapper_materialized_blob] overrides previous : org.hibernate.type.WrappedMaterializedBlobType@53850626 [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [clob] overrides previous : org.hibernate.type.ClobType@4256d3a0 [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [java.sql.Clob] overrides previous : org.hibernate.type.ClobType@4256d3a0 [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Type registration [wrapper_characters_clob] overrides previous : org.hibernate.type.CharacterArrayClobType@525fcf66 [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:49: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170216-20:26:49: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:26:49: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:26:49: [INFO]: updating schema [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:26:49: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TASK_QUEUE [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: columns: [id, nextexecution, status, pid, trycount] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: indexes: [index_pid_task, index_exec_task, primary_key_1] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TRY_HISTORY [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: columns: [id, lastreplicationattemptdate, pid, replicationattempts, nodeid] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: indexes: [index_pid, primary_key_b] [org.hibernate.tool.hbm2ddl.TableMetadata]
20170216-20:26:49: [INFO]: schema update complete [org.hibernate.tool.hbm2ddl.SchemaUpdate]
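
The hbm2ddl output above finds two existing tables, REPLICATION_TASK_QUEUE and REPLICATION_TRY_HISTORY, with the listed columns and indexes, and leaves them unchanged. A rough JPA sketch of what the first mapping might look like given those names; the field types, id generation strategy, and omission of index definitions are assumptions, and the real class is org.dataone.cn.data.repository.ReplicationTask, which is not reproduced here.

    import java.util.Date;
    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Table;
    import javax.persistence.Temporal;
    import javax.persistence.TemporalType;

    // Table and column names mirror the hbm2ddl report above:
    // TESTDB.PUBLIC.REPLICATION_TASK_QUEUE with [id, nextexecution, status, pid, trycount].
    @Entity
    @Table(name = "replication_task_queue")
    public class ReplicationTaskSketch {

        @Id
        @GeneratedValue
        private Long id;

        @Column(name = "pid")
        private String pid;

        @Column(name = "status")
        private String status;

        @Column(name = "trycount")
        private Integer tryCount;

        // Modeled as a timestamp here; the real column could equally hold a numeric epoch value.
        @Temporal(TemporalType.TIMESTAMP)
        @Column(name = "nextexecution")
        private Date nextExecution;
    }
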
20170216-20:26:49: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@263015db: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationH2RepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationPostgresRepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationTaskRepository,replicationAttemptHistoryRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@4d47391e: startup date [Thu Feb 16 20:26:49 UTC 2017]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
Feb 16, 2017 8:26:49 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Initializing cluster partition table first arrangement...
20170216-20:26:49: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20170216-20:26:49: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:49: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:49: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:26:49: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:49: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:50: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20170216-20:26:50: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20170216-20:26:50: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20170216-20:26:50: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:26:50: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20170216-20:26:50: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20170216-20:26:50: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20170216-20:26:50: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20170216-20:26:50: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20170216-20:26:50: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20170216-20:26:50: [INFO]: Using dialect: org.hibernate.dialect.PostgreSQLDialect [org.hibernate.dialect.Dialect]
20170216-20:26:50: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20170216-20:26:50: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
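
The two settings above do not match: Hibernate selects org.hibernate.dialect.PostgreSQLDialect while the JDBC metadata identifies both the database and the driver as H2 1.3.163, so the PostgreSQL system-catalog queries issued later (against pg_class) cannot succeed. The "Overriding bean definition" lines earlier show the Postgres factory's dataSource and jpaVendorAdapter definitions replacing the H2 factory's. A minimal sketch of how the H2 side of such a factory could be wired; the class, bean, and dialect names are taken from the log, while everything else (plain @Bean style, JDBC URL, credentials) is an illustrative assumption, not the project's actual code:

    import javax.sql.DataSource;
    import org.apache.commons.dbcp.BasicDataSource;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

    @Configuration
    public class ReplicationH2RepositoryFactorySketch {

        @Bean
        public DataSource dataSource() {
            BasicDataSource ds = new BasicDataSource();              // commons-dbcp shows up in the trace below
            ds.setDriverClassName("org.h2.Driver");
            ds.setUrl("jdbc:h2:mem:replication;DB_CLOSE_DELAY=-1");  // assumed in-memory test URL
            ds.setUsername("sa");
            ds.setPassword("");
            return ds;
        }

        @Bean
        public HibernateJpaVendorAdapter jpaVendorAdapter() {
            HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
            // The Postgres factory presumably sets org.hibernate.dialect.PostgreSQLDialect here; when its
            // bean definition overrides this one (see the "Overriding bean definition" lines above) while
            // the live connection is still H2, Hibernate ends up issuing pg_class queries against H2.
            adapter.setDatabasePlatform("org.hibernate.dialect.H2Dialect");
            adapter.setGenerateDdl(true);
            return adapter;
        }
    }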
20170216-20:26:50: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20170216-20:26:50: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20170216-20:26:50: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20170216-20:26:50: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20170216-20:26:50: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20170216-20:26:50: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@4f2fed4f [org.hibernate.type.BasicTypeRegistry]
20170216-20:26:50: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20170216-20:26:50: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:26:50: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20170216-20:26:50: [ERROR]: could not get database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
	at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
	at org.h2.message.DbException.get(DbException.java:169)
	at org.h2.message.DbException.get(DbException.java:146)
	at org.h2.command.Parser.readTableOrView(Parser.java:4758)
	at org.h2.command.Parser.readTableFilter(Parser.java:1080)
	at org.h2.command.Parser.parseSelectSimpleFromPart(Parser.java:1686)
	at org.h2.command.Parser.parseSelectSimple(Parser.java:1793)
	at org.h2.command.Parser.parseSelectSub(Parser.java:1680)
	at org.h2.command.Parser.parseSelectUnion(Parser.java:1523)
	at org.h2.command.Parser.parseSelect(Parser.java:1511)
	at org.h2.command.Parser.parsePrepared(Parser.java:405)
	at org.h2.command.Parser.parse(Parser.java:279)
	at org.h2.command.Parser.parse(Parser.java:251)
	at org.h2.command.Parser.prepareCommand(Parser.java:217)
	at org.h2.engine.Session.prepareLocal(Session.java:415)
	at org.h2.engine.Session.prepareCommand(Session.java:364)
	at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1121)
	at org.h2.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:70)
	at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.initSequences(DatabaseMetadata.java:151)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:69)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:62)
	at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:170)
	at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:375)
	at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
	at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
	at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
	at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:287)
	at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
	at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
	at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1105)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:915)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:472)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
	at org.dataone.cn.model.repository.D1BaseJpaRepositoryConfiguration.initContext(D1BaseJpaRepositoryConfiguration.java:39)
	at org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory.getReplicationTaskRepository(ReplicationPostgresRepositoryFactory.java:68)
	at org.dataone.service.cn.replication.ReplicationFactory.getReplicationTaskRepository(ReplicationFactory.java:74)
	at org.dataone.service.cn.replication.ReplicationTaskProcessor.<clinit>(ReplicationTaskProcessor.java:20)
	at org.dataone.service.cn.replication.ReplicationManager.startReplicationTaskProcessing(ReplicationManager.java:1028)
	at org.dataone.service.cn.replication.ReplicationManager.<init>(ReplicationManager.java:183)
	at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:175)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
	at org.junit.runners.Suite.runChild(Suite.java:128)
	at org.junit.runners.Suite.runChild(Suite.java:24)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
	at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
	at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
	at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
20170216-20:26:50: [ERROR]: could not complete schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
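
Both ERROR entries ("could not get database metadata" and "could not complete schema update") stem from the same incompatibility: the sequence-discovery statement that PostgreSQLDialect runs during the hbm2ddl schema update reads the PostgreSQL catalog table pg_class, which H2 does not provide. A self-contained sketch that reproduces the 42102 failure against any H2 database; the SQL is copied verbatim from the trace above, and the in-memory JDBC URL is an assumption:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class PgClassOnH2Sketch {
        public static void main(String[] args) throws Exception {
            Class.forName("org.h2.Driver");  // explicit load; harmless on drivers that self-register
            try (Connection c = DriverManager.getConnection("jdbc:h2:mem:repro", "sa", "");
                 Statement s = c.createStatement()) {
                // Same statement PostgreSQLDialect uses to discover sequences (see the trace above).
                // On H2 this throws org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found (42102),
                // which is exactly what SchemaUpdate reports in this log.
                s.executeQuery("select relname from pg_class where relkind='S'");
            }
        }
    }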
20170216-20:26:50: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@4fba185c: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationPostgresRepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationH2RepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationTaskRepository,replicationAttemptHistoryRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20170216-20:26:50: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20170216-20:26:50: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ [42102-163] [org.hibernate.util.JDBCExceptionReporter]
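
The generated select lists the columns Hibernate expects on replication_task_queue (id, nextExecution, pid, status, tryCount); because the schema update above failed, hbm2ddl never created the table, so every query against it fails with 42S02. For orientation, a sketch of the mapping implied by that SQL and by the earlier "Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue" line; the class name, table name, and column names come from this log, while the field types and annotation details are assumptions:

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Table;

    @Entity
    @Table(name = "replication_task_queue")
    public class ReplicationTaskSketch {

        @Id
        @GeneratedValue
        private long id;

        private long nextExecution;   // assumed timestamp for the next replication attempt
        private String pid;           // object identifier being replicated (pid "42" in this run)
        private String status;        // queue state the task processor filters on
        private int tryCount;         // number of attempts so far

        // getters and setters omitted in this sketch
    }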
20170216-20:26:50: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:26:50: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20170216-20:26:50: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:50: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:50: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:50: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20170216-20:26:50: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:50: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:50: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:50: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20170216-20:26:50: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ where replicatio0_.status=? and replicatio0_.nextExecution<? order by replicatio0_.nextExecution asc limit ? [42102-163] [org.hibernate.util.JDBCExceptionReporter]
20170216-20:26:50: [INFO]: RestClient.doRequestNoBody, thread(90) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v1/node [org.dataone.client.rest.RestClient]
20170216-20:26:50: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
20170216-20:26:52: [INFO]: testCreateAndQueueTask replicationManager.createAndQueueTask [org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit]
20170216-20:26:52: [INFO]: node initial refresh: new cached time: Feb 16, 2017 8:26:52 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20170216-20:26:52: [INFO]: for pid: 42 source MN: urn:node:testmn1 service info: MNRead v2 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:52: [INFO]: for pid: 42 source MN: urn:node:testmn1 service info: MNRead v1 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:52: [INFO]: for pid: 42, source MN: urn:node:testmn1 requires v2 replication. [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:53: [INFO]: for pid: 42, target MN: urn:node:testmn5 supports v2 MNReplication. [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:53: [INFO]: for pid: 42, target MN: urn:node:testmn2 supports v2 MNReplication. [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:53: [INFO]: Retrieving performance metrics for the potential replication list for 42 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:53: [INFO]: Priority score for urn:node:testmn5 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:53: [INFO]: Priority score for urn:node:testmn2 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:53: [WARN]: In Replication Manager, task that should exist 'in process' does not exist.  Creating new task for pid: 42 [org.dataone.service.cn.replication.ReplicationManager]
Feb 16, 2017 8:26:53 PM com.hazelcast.impl.ClientHandlerService
SEVERE: [127.0.0.1]:5730 [DataONEBuildTest] null
java.lang.NullPointerException
	at com.hazelcast.impl.ConcurrentMapManager.doPutAll(ConcurrentMapManager.java:1025)
	at com.hazelcast.impl.ClientHandlerService$MapPutAllHandler.processMapOp(ClientHandlerService.java:895)
	at com.hazelcast.impl.ClientHandlerService$ClientMapOperationHandler.processCall(ClientHandlerService.java:1603)
	at com.hazelcast.impl.ClientHandlerService$ClientOperationHandler.handle(ClientHandlerService.java:1565)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:57)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:54)
	at com.hazelcast.impl.ClientRequestHandler.doRun(ClientRequestHandler.java:63)
	at com.hazelcast.impl.FallThroughRunnable.run(FallThroughRunnable.java:22)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
	at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)

20170216-20:26:53: [INFO]: Number of replicas desired for identifier 42 is 3 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:53: [INFO]: Potential target node list size for 42 is 2 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:53: [INFO]: Changed the desired replicas for identifier 42 to the size of the potential target node list: 2 [org.dataone.service.cn.replication.ReplicationManager]
20170216-20:26:53: [INFO]: node initial refresh: new cached time: Feb 16, 2017 8:26:53 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20170216-20:26:53: [INFO]: node initial refresh: new cached time: Feb 16, 2017 8:26:53 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20170216-20:26:53: [INFO]: Added 2 MNReplicationTasks to the queue for 42 [org.dataone.service.cn.replication.ReplicationManager]
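
The INFO lines just above record a small adjustment worth spelling out: the desired replica count (3) is capped at the number of viable target nodes (2), and two MNReplicationTasks are then queued for pid 42. A trivial sketch of that capping step with illustrative names; the actual ReplicationManager logic is not shown in this log:

    public class ReplicaCapSketch {
        public static void main(String[] args) {
            int desiredReplicas = 3;   // "Number of replicas desired for identifier 42 is 3"
            int potentialTargets = 2;  // "Potential target node list size for 42 is 2"
            int effective = Math.min(desiredReplicas, potentialTargets);
            System.out.println("effective replicas for 42 = " + effective);  // 2, matching the log
        }
    }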
Feb 16, 2017 8:26:53 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTTING_DOWN
Feb 16, 2017 8:26:53 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:26:53 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:26:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Feb 16, 2017 8:26:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Feb 16, 2017 8:26:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:26:53 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:26:53 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5732, Cause:java.io.EOFException
Feb 16, 2017 8:26:53 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-1 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Feb 16, 2017 8:26:53 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Feb 16, 2017 8:26:53 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Feb 16, 2017 8:26:53 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Feb 16, 2017 8:26:53 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:47552] lost. Reason: Explicit close
Feb 16, 2017 8:26:54 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:47553] lost. Reason: Explicit close
Feb 16, 2017 8:26:54 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:50166
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:50167
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:50166
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:50167
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:50166 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:50167 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:26:54 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Shutdown is completed in 710 ms.
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTDOWN
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTTING_DOWN
Feb 16, 2017 8:26:54 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5732
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5732 [DataONEBuildTest] hz.hzProcessInstance2.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: Explicit close
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[0]. PartitionReplicaChangeEvent{partitionId=0, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[1]. PartitionReplicaChangeEvent{partitionId=1, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[2]. PartitionReplicaChangeEvent{partitionId=2, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[6]. PartitionReplicaChangeEvent{partitionId=6, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[7]. PartitionReplicaChangeEvent{partitionId=7, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[9]. PartitionReplicaChangeEvent{partitionId=9, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[16]. PartitionReplicaChangeEvent{partitionId=16, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[17]. PartitionReplicaChangeEvent{partitionId=17, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[18]. PartitionReplicaChangeEvent{partitionId=18, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[24]. PartitionReplicaChangeEvent{partitionId=24, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[25]. PartitionReplicaChangeEvent{partitionId=25, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[27]. PartitionReplicaChangeEvent{partitionId=27, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[29]. PartitionReplicaChangeEvent{partitionId=29, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[33]. PartitionReplicaChangeEvent{partitionId=33, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[38]. PartitionReplicaChangeEvent{partitionId=38, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[40]. PartitionReplicaChangeEvent{partitionId=40, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[50]. PartitionReplicaChangeEvent{partitionId=50, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[54]. PartitionReplicaChangeEvent{partitionId=54, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[55]. PartitionReplicaChangeEvent{partitionId=55, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[56]. PartitionReplicaChangeEvent{partitionId=56, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[65]. PartitionReplicaChangeEvent{partitionId=65, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[70]. PartitionReplicaChangeEvent{partitionId=70, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[74]. PartitionReplicaChangeEvent{partitionId=74, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[85]. PartitionReplicaChangeEvent{partitionId=85, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[86]. PartitionReplicaChangeEvent{partitionId=86, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[87]. PartitionReplicaChangeEvent{partitionId=87, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[89]. PartitionReplicaChangeEvent{partitionId=89, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[90]. PartitionReplicaChangeEvent{partitionId=90, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[93]. PartitionReplicaChangeEvent{partitionId=93, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[102]. PartitionReplicaChangeEvent{partitionId=102, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[103]. PartitionReplicaChangeEvent{partitionId=103, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[105]. PartitionReplicaChangeEvent{partitionId=105, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[108]. PartitionReplicaChangeEvent{partitionId=108, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[124]. PartitionReplicaChangeEvent{partitionId=124, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[127]. PartitionReplicaChangeEvent{partitionId=127, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[135]. PartitionReplicaChangeEvent{partitionId=135, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[138]. PartitionReplicaChangeEvent{partitionId=138, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[139]. PartitionReplicaChangeEvent{partitionId=139, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[141]. PartitionReplicaChangeEvent{partitionId=141, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[146]. PartitionReplicaChangeEvent{partitionId=146, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[147]. PartitionReplicaChangeEvent{partitionId=147, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[148]. PartitionReplicaChangeEvent{partitionId=148, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[150]. PartitionReplicaChangeEvent{partitionId=150, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[154]. PartitionReplicaChangeEvent{partitionId=154, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[156]. PartitionReplicaChangeEvent{partitionId=156, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[157]. PartitionReplicaChangeEvent{partitionId=157, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[159]. PartitionReplicaChangeEvent{partitionId=159, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[160]. PartitionReplicaChangeEvent{partitionId=160, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[164]. PartitionReplicaChangeEvent{partitionId=164, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[167]. PartitionReplicaChangeEvent{partitionId=167, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[168]. PartitionReplicaChangeEvent{partitionId=168, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[169]. PartitionReplicaChangeEvent{partitionId=169, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[171]. PartitionReplicaChangeEvent{partitionId=171, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[173]. PartitionReplicaChangeEvent{partitionId=173, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[174]. PartitionReplicaChangeEvent{partitionId=174, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[176]. PartitionReplicaChangeEvent{partitionId=176, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[180]. PartitionReplicaChangeEvent{partitionId=180, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[181]. PartitionReplicaChangeEvent{partitionId=181, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[183]. PartitionReplicaChangeEvent{partitionId=183, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[187]. PartitionReplicaChangeEvent{partitionId=187, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[188]. PartitionReplicaChangeEvent{partitionId=188, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[191]. PartitionReplicaChangeEvent{partitionId=191, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[192]. PartitionReplicaChangeEvent{partitionId=192, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[193]. PartitionReplicaChangeEvent{partitionId=193, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[201]. PartitionReplicaChangeEvent{partitionId=201, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[202]. PartitionReplicaChangeEvent{partitionId=202, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[204]. PartitionReplicaChangeEvent{partitionId=204, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[205]. PartitionReplicaChangeEvent{partitionId=205, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[206]. PartitionReplicaChangeEvent{partitionId=206, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[207]. PartitionReplicaChangeEvent{partitionId=207, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[218]. PartitionReplicaChangeEvent{partitionId=218, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[220]. PartitionReplicaChangeEvent{partitionId=220, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[226]. PartitionReplicaChangeEvent{partitionId=226, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[230]. PartitionReplicaChangeEvent{partitionId=230, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[233]. PartitionReplicaChangeEvent{partitionId=233, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[234]. PartitionReplicaChangeEvent{partitionId=234, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[241]. PartitionReplicaChangeEvent{partitionId=241, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[242]. PartitionReplicaChangeEvent{partitionId=242, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[244]. PartitionReplicaChangeEvent{partitionId=244, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[245]. PartitionReplicaChangeEvent{partitionId=245, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[246]. PartitionReplicaChangeEvent{partitionId=246, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[247]. PartitionReplicaChangeEvent{partitionId=247, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[251]. PartitionReplicaChangeEvent{partitionId=251, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[256]. PartitionReplicaChangeEvent{partitionId=256, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[258]. PartitionReplicaChangeEvent{partitionId=258, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[262]. PartitionReplicaChangeEvent{partitionId=262, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[264]. PartitionReplicaChangeEvent{partitionId=264, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[267]. PartitionReplicaChangeEvent{partitionId=267, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Feb 16, 2017 8:26:54 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [1] {
	Member [127.0.0.1]:5731 this
}
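
The long run of "Possible data loss" warnings above, followed by this one-member cluster view, is Hazelcast reporting that the member on port 5732 left while it still owned partitions for which the survivor on 5731 held no backup replica. Below is a minimal sketch of keeping one backup replica per entry via Hazelcast's programmatic Config API; the cluster group name is taken from the log for illustration only, and this is not the test's actual configuration (its hazelcast.xml is not part of this output).

    import com.hazelcast.config.Config;
    import com.hazelcast.config.MapConfig;
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;

    public class BackupConfigSketch {
        public static void main(String[] args) {
            Config config = new Config();
            // Cluster group name copied from the log for illustration.
            config.getGroupConfig().setName("DataONEBuildTest");

            // Keep one synchronous backup per map entry so a single departing member
            // does not leave partitions without a surviving replica.
            MapConfig mapConfig = new MapConfig();
            mapConfig.setName("default");
            mapConfig.setBackupCount(1);
            config.addMapConfig(mapConfig);

            HazelcastInstance member = Hazelcast.newHazelcastInstance(config);
            System.out.println("Started " + member.getCluster().getLocalMember());
        }
    }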

Feb 16, 2017 8:26:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:50166] lost. Reason: Explicit close
Feb 16, 2017 8:26:54 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:50167] lost. Reason: Explicit close
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49271
Feb 16, 2017 8:26:54 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49271
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49272
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49271 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:26:54 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49272 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:26:54 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49272
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Feb 16, 2017 8:26:54 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
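
The CLIENT_CONNECTION_LOST / CLIENT_CONNECTION_OPENING / CLIENT_CONNECTION_OPENED lines are lifecycle events emitted while the Hazelcast clients drop their connections to the shut-down member on 5732 and reattach to the surviving member on 5731. The sketch below shows how such transitions can be observed through the generic LifecycleListener interface; it starts a plain local instance for simplicity and is an assumption-based illustration, not the test's own client setup.

    import com.hazelcast.config.Config;
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.core.LifecycleEvent;
    import com.hazelcast.core.LifecycleListener;

    public class LifecycleLogger {
        public static void main(String[] args) {
            // Start a local instance; client instances expose the same LifecycleService
            // and report the CLIENT_CONNECTION_* states seen in the log above.
            HazelcastInstance instance = Hazelcast.newHazelcastInstance(new Config());
            instance.getLifecycleService().addLifecycleListener(new LifecycleListener() {
                public void stateChanged(LifecycleEvent event) {
                    System.out.println("Hazelcast lifecycle state: " + event.getState());
                }
            });
        }
    }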
Feb 16, 2017 8:26:55 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:26:55 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Shutdown is completed in 1615 ms.
Feb 16, 2017 8:26:55 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTDOWN
Feb 16, 2017 8:26:55 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTTING_DOWN
Feb 16, 2017 8:26:56 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49272] lost. Reason: Explicit close
Feb 16, 2017 8:26:56 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Feb 16, 2017 8:26:56 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Feb 16, 2017 8:26:56 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:26:56 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49271] lost. Reason: Explicit close
Feb 16, 2017 8:26:56 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Feb 16, 2017 8:26:56 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,999 ms later, attempt 1 of 1.
Feb 16, 2017 8:26:56 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 4,999 ms later, attempt 1 of 1.
Feb 16, 2017 8:26:57 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Destroying node initializer.
Feb 16, 2017 8:26:57 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Shutdown is completed in 1300 ms.
Feb 16, 2017 8:26:57 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTDOWN
20170216-20:26:57: [INFO]: Unbind of an LDAP service (9389) is complete. [org.apache.directory.server.ldap.LdapServer]
20170216-20:26:57: [INFO]: Sending notice of disconnect to existing clients sessions. [org.apache.directory.server.ldap.LdapServer]
20170216-20:26:57: [INFO]: Ldap service stopped. [org.apache.directory.server.ldap.LdapServer]
20170216-20:26:57: [WARN]: javax.naming.CommunicationException: localhost:9389 connection closed [org.dataone.cn.ldap.DirContextUnsolicitedNotificationListener]
20170216-20:26:57: [INFO]: clearing all the caches [org.apache.directory.server.core.api.CacheService]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.631 sec
Running org.dataone.service.cn.replication.v2.TestReplicationPrioritization
Node: node1 request factor: 1.0
Node: node4 request factor: 1.0
Node: node2 request factor: 0.0
Node: node3 request factor: 0.8333333
20170216-20:26:57: [INFO]: Retrieving performance metrics for the potential replication list for testPid1 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:57: [INFO]: Node node3 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:57: [INFO]: Priority score for node1 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:57: [INFO]: Priority score for node2 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:57: [INFO]: Priority score for node3 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:57: [INFO]: Priority score for node4 is 2.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:57: [INFO]: Removed node3, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20170216-20:26:57: [INFO]: Removed node2, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node4
Node: node1
20170216-20:26:57: [INFO]: Node node1 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node1 request factor: 0.0
Node: node4 request factor: 1.0
Node: node2 request factor: 1.0
Node: node3 request factor: 1.0
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.036 sec
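
The TestReplicationPrioritization output above exercises ReplicationPrioritizationStrategy: each candidate node gets a request factor, a node over its limit of 10 pending requests is forced to 0.0, the factors feed a priority score, and zero-scoring nodes are removed from the candidate list. The sketch below mimics only that filtering step; the score formula, factor names, and weighting are assumptions for illustration, not DataONE's actual implementation.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Illustrative only: mimics the "remove zero-scoring nodes" step seen in the log.
    public class PrioritizationSketch {

        // Hypothetical score: a product of per-node factors; the real strategy may
        // combine additional factors (failure rate, bandwidth, preference) and weights.
        static double priorityScore(double requestFactor, double otherFactors) {
            return requestFactor * otherFactors;
        }

        public static void main(String[] args) {
            Map<String, Double> requestFactors = new LinkedHashMap<String, Double>();
            requestFactors.put("node1", 1.0);
            requestFactors.put("node2", 0.0);
            requestFactors.put("node3", 0.0); // over its request limit of 10, so factor drops to 0.0
            requestFactors.put("node4", 1.0);

            for (Map.Entry<String, Double> entry : requestFactors.entrySet()) {
                double score = priorityScore(entry.getValue(), 1.0);
                if (score == 0.0) {
                    System.out.println("Removed " + entry.getKey() + ", score is " + score);
                } else {
                    System.out.println("Priority score for " + entry.getKey() + " is " + score);
                }
            }
        }
    }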
20170216-20:26:57: [INFO]: Closing org.springframework.context.support.GenericApplicationContext@50f547c5: startup date [Thu Feb 16 20:26:29 UTC 2017]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20170216-20:26:57: [INFO]: Destroying singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@6d67b6d0: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]

Results :

Tests run: 15, Failures: 0, Errors: 0, Skipped: 0

[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ d1_replication ---
[INFO] Building jar: /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/d1_replication-2.3.1.jar
[INFO] 
[INFO] --- maven-install-plugin:2.3:install (default-install) @ d1_replication ---
[INFO] Installing /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/d1_replication-2.3.1.jar to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.jar
[INFO] Installing /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/pom.xml to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.pom
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.4:create (default) @ d1_replication ---
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication
[INFO] Storing buildNumber: 18630 at timestamp: 1487276820558
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication
[INFO] Storing buildScmBranch: tags/D1_REPLICATION_v2.3.1
[WARNING] Failed to getClass for org.apache.maven.plugin.javadoc.JavadocReport
[INFO] 
[INFO] --- maven-javadoc-plugin:2.10.4:javadoc (default-cli) @ d1_replication ---
[INFO] 
Loading source files for package org.dataone.cn.data.repository...
Loading source files for package org.dataone.service.cn.replication...
Loading source files for package org.dataone.service.cn.replication.v2...
Loading source files for package org.dataone.service.cn.replication.v1...
Constructing Javadoc information...
Standard Doclet version 1.7.0_121
Building tree for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/overview-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/constant-values.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/serialized-form.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-use.html...
Building index for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/overview-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/index-all.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/deprecated-list.html...
Building index for all classes...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/allclasses-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/allclasses-noframe.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/index.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/overview-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/site/apidocs/help-doc.html...
16 warnings
[WARNING] Javadoc Warnings
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:258: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:280: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:303: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:499: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:542: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:572: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:622: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:406: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:805: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:146: warning - @param argument "repAttemptHistoryRepos" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:140: warning - @param argument "serialVersion" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:331: warning - @param argument "session" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:82: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:99: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:117: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:206: warning - @return tag has no arguments.
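
The sixteen Javadoc warnings fall into two groups: "@return tag has no arguments" means a method comment carries a bare @return with no description, and "@param argument ... is not a parameter name" means a @param tag names something that is not in the method's signature. Both are resolved by keeping the tags in step with the code, as in the following sketch; the class and method here are illustrative, not the actual ReplicationManager or ReplicationService source.

    // Illustrative only -- not the actual ReplicationManager or ReplicationService source.
    public class JavadocExample {

        /**
         * Counts replication tasks currently queued for a member node.
         *
         * @param nodeId identifier of the member node; the tag name must match the parameter
         * @return the number of queued replication tasks, never negative
         */
        public int countQueuedTasks(String nodeId) {
            return 0; // placeholder body for the sketch
        }
    }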
[JENKINS] Archiving  javadoc
Notifying upstream projects of job completion
Join notifier requires a CauseAction
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:04.937s
[INFO] Finished at: Thu Feb 16 20:27:06 UTC 2017
[INFO] Final Memory: 57M/535M
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data
[JENKINS] Archiving /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/pom.xml to org.dataone/d1_replication/2.3.1/d1_replication-2.3.1.pom
[JENKINS] Archiving /var/lib/jenkins/jobs/d1_replication_stable/workspace/d1_replication/target/d1_replication-2.3.1.jar to org.dataone/d1_replication/2.3.1/d1_replication-2.3.1.jar
channel stopped
Maven RedeployPublisher use remote  maven settings from : /usr/share/maven/conf/settings.xml
[ERROR] uniqueVersion == false is not anymore supported in maven 3
[INFO] Deployment in file:///var/www/maven (id=,uniqueVersion=false)
Deploying the main artifact d1_replication-2.3.1.jar
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.jar
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.jar (73 KB at 9069.6 KB/sec)
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.pom
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.3.1/d1_replication-2.3.1.pom (7 KB)
Downloading: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml
Downloaded: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml (2 KB at 115.0 KB/sec)
Uploading: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml
Uploaded: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml (2 KB)
[INFO] Deployment done in 0.26 sec
IRC notifier plugin: Sending notification to: #dataone-build
IRC notifier plugin: [ERROR] not connected. Cannot send message to '#dataone-build'
Notifying upstream projects of job completion
Warning: you have no plugins providing access control for builds, so falling back to legacy behavior of permitting any downstream builds to be triggered
Finished: SUCCESS