Console Output (build result: Success)

Skipping 229 KB of earlier output..
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpMaxClientLeadTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpFailOverEndpointState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRange [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPermitList [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeasesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOption [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpNetMask [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRange [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPoolDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeasesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPoolDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAddressState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpExpirationTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStartTimeOfState [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLastTransactionTime [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpBootpFlag [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDomainName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpDnsStatus [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRequestedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedHostName [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpReservedForClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpAssignedToClient [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpRelayAgentInfo [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpErrorLog [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubclassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassData [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpPrimaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSecondaryDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSharedNetworkDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpSubnetDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpGroupDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHostDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpClassesDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpLeaseDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpHWAddress [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpOptionsDN [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:09: [INFO]: ERR_04447_CANNOT_NORMALIZE_VALUE Cannot normalize the wrapped value ERR_04226 I do not know how to handle NameAndOptionalUID normalization with objects of class: dhcpStatements [org.apache.directory.api.ldap.model.entry.AbstractValue]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:10: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
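
The repeated LdifReader lines above are emitted while LDIF fixture entries that lack a "version: 1" header are loaded; the reader then assumes LDIF version 1 and continues. A minimal sketch of reading such a file with the Directory API LdifReader (the file name and the printing of DNs are illustrative, not taken from this build):

    import java.io.File;

    import org.apache.directory.api.ldap.model.ldif.LdifEntry;
    import org.apache.directory.api.ldap.model.ldif.LdifReader;

    public class LdifLoadSketch {
        public static void main(String[] args) throws Exception {
            // If the file has no "version: 1" line, LdifReader logs
            // "No version information : assuming version: 1" and proceeds.
            LdifReader reader = new LdifReader(new File("fixture.ldif")); // placeholder file
            for (LdifEntry entry : reader) {
                System.out.println(entry.getDn());
            }
            reader.close();
        }
    }
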
20160706-18:43:10: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:10: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:10: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:10: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:10: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:10: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:10: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20160706-18:43:10: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20160706-18:43:10: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20160706-18:43:10: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20160706-18:43:10: [INFO]: fetching the cache named system [org.apache.directory.server.core.api.CacheService]
20160706-18:43:10: [INFO]: No cache with name system exists, creating one [org.apache.directory.server.core.api.CacheService]
20160706-18:43:11: [INFO]: Keys and self signed certificate successfully generated. [org.apache.directory.server.core.security.TlsKeyGenerator]
20160706-18:43:11: [INFO]: fetching the cache named groupCache [org.apache.directory.server.core.api.CacheService]
20160706-18:43:11: [INFO]: Initializing ... [org.apache.directory.server.core.event.EventInterceptor]
20160706-18:43:11: [INFO]: Initialization complete. [org.apache.directory.server.core.event.EventInterceptor]
20160706-18:43:11: [WARN]: You didn't change the admin password of directory service instance 'org'.  Please update the admin password as soon as possible to prevent a possible security breach. [org.apache.directory.server.core.DefaultDirectoryService]
20160706-18:43:11: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:11: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:11: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:11: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:11: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:11: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmIndex]
20160706-18:43:11: [INFO]: fetching the cache named alias [org.apache.directory.server.core.api.CacheService]
20160706-18:43:11: [INFO]: fetching the cache named piar [org.apache.directory.server.core.api.CacheService]
20160706-18:43:11: [INFO]: fetching the cache named entryDn [org.apache.directory.server.core.api.CacheService]
20160706-18:43:11: [INFO]: Setting CacheRecondManager's cache size to 100 [org.apache.directory.server.core.partition.impl.btree.jdbm.JdbmPartition]
20160706-18:43:11: [INFO]: fetching the cache named org [org.apache.directory.server.core.api.CacheService]
20160706-18:43:11: [INFO]: No cache with name org exists, creating one [org.apache.directory.server.core.api.CacheService]
20160706-18:43:11: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:11: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20160706-18:43:11: [INFO]: Loading dataone enabled schema: 
	Schema Name: dataone
		Disabled: false
		Owner: 0.9.2342.19200300.100.1.1=admin,2.5.4.11=system
		Dependencies: [system, core] [org.apache.directory.api.ldap.schema.manager.impl.DefaultSchemaManager]
20160706-18:43:13: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:13: [INFO]: No version information : assuming version: 1 [org.apache.directory.api.ldap.model.ldif.LdifReader]
20160706-18:43:15: [INFO]: Successful bind of an LDAP Service (9389) is completed. [org.apache.directory.server.ldap.LdapServer]
20160706-18:43:15: [INFO]: Ldap service started. [org.apache.directory.server.ldap.LdapServer]
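
The "Successful bind of an LDAP Service (9389)" and "Ldap service started." lines come from an embedded ApacheDS LdapServer bound to port 9389. A rough sketch of that embedding step, assuming a DirectoryService has already been built and started elsewhere (the partition, index, and schema setup that produce the JdbmIndex/CacheService lines above is omitted):

    import org.apache.directory.server.core.api.DirectoryService;
    import org.apache.directory.server.ldap.LdapServer;
    import org.apache.directory.server.protocol.shared.transport.TcpTransport;

    public class EmbeddedLdapSketch {
        // Assumes 'directoryService' was initialized and started elsewhere.
        public static LdapServer startLdap(DirectoryService directoryService) throws Exception {
            LdapServer ldapServer = new LdapServer();
            ldapServer.setDirectoryService(directoryService);
            ldapServer.setTransports(new TcpTransport(9389)); // port seen in the log
            ldapServer.start();                               // "Ldap service started."
            return ldapServer;
        }
    }
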
20160706-18:43:15: [INFO]: Loading XML bean definitions from class path resource [org/dataone/configuration/testApplicationContext.xml] [org.springframework.beans.factory.xml.XmlBeanDefinitionReader]
20160706-18:43:15: [INFO]: Refreshing org.springframework.context.support.GenericApplicationContext@afd6b6c: startup date [Wed Jul 06 18:43:15 UTC 2016]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20160706-18:43:15: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@9102c2e: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
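
The two Spring lines above show bean definitions being read from org/dataone/configuration/testApplicationContext.xml into a GenericApplicationContext and its singletons being pre-instantiated. A minimal sketch of that pattern (the resource path is the one named in the log; the rest is generic Spring usage, not the test's actual bootstrap code):

    import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
    import org.springframework.context.support.GenericApplicationContext;
    import org.springframework.core.io.ClassPathResource;

    public class TestContextSketch {
        public static void main(String[] args) {
            GenericApplicationContext ctx = new GenericApplicationContext();
            // Logs "Loading XML bean definitions from class path resource [...]"
            new XmlBeanDefinitionReader(ctx).loadBeanDefinitions(
                    new ClassPathResource("org/dataone/configuration/testApplicationContext.xml"));
            // refresh() triggers "Refreshing ..." and "Pre-instantiating singletons ..."
            ctx.refresh();
            ctx.close();
        }
    }
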
Jul 06, 2016 6:43:16 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Hazelcast Group Config:
GroupConfig [name=DataONEBuildTest, password=*******************]
Hazelcast Maps: hzSystemMetadata hzReplicationTasksMap hzNodes 
Hazelcast Queues: hzReplicationTasks 
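
The block above records Hazelcast being configured from org/dataone/configuration/hazelcast.xml for group DataONEBuildTest, along with the distributed maps and queue it defines. A sketch of how such a member instance is typically created from a classpath XML config (Hazelcast 2.x API, matching the 2.4.1 version printed later in the log; the structure names are those listed above):

    import com.hazelcast.config.ClasspathXmlConfig;
    import com.hazelcast.config.Config;
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;

    public class HazelcastMemberSketch {
        public static void main(String[] args) {
            // Logs "Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'."
            Config config = new ClasspathXmlConfig("org/dataone/configuration/hazelcast.xml");
            HazelcastInstance hz = Hazelcast.newHazelcastInstance(config);

            // Distributed structures named in the config summary above.
            hz.getMap("hzSystemMetadata");
            hz.getMap("hzReplicationTasksMap");
            hz.getMap("hzNodes");
            hz.getQueue("hzReplicationTasks");
        }
    }
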
Jul 06, 2016 6:43:16 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Jul 06, 2016 6:43:16 PM com.hazelcast.impl.AddressPicker
WARNING: Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Jul 06, 2016 6:43:16 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5730, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5730], bind any local is true
Jul 06, 2016 6:43:16 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5730
Jul 06, 2016 6:43:16 PM com.hazelcast.system
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Jul 06, 2016 6:43:16 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTING
Jul 06, 2016 6:43:16 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Jul 06, 2016 6:43:16 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Jul 06, 2016 6:43:16 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Jul 06, 2016 6:43:16 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Jul 06, 2016 6:43:17 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Jul 06, 2016 6:43:17 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Could not connect to: /127.0.0.1:5731. Reason: ConnectException[Connection refused]
Jul 06, 2016 6:43:17 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 


Members [1] {
	Member [127.0.0.1]:5730 this
}

Jul 06, 2016 6:43:17 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is STARTED
Jul 06, 2016 6:43:17 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Jul 06, 2016 6:43:17 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5731, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5731], bind any local is true
Jul 06, 2016 6:43:17 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5731
Jul 06, 2016 6:43:17 PM com.hazelcast.system
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Jul 06, 2016 6:43:17 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTING
Jul 06, 2016 6:43:17 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Jul 06, 2016 6:43:17 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5732
Jul 06, 2016 6:43:17 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Jul 06, 2016 6:43:17 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:57510
Jul 06, 2016 6:43:17 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 57510 accepted socket connection from /127.0.0.1:5730
Jul 06, 2016 6:43:17 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:57510
Jul 06, 2016 6:43:18 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:18 PM com.hazelcast.nio.SocketConnector
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Could not connect to: /127.0.0.1:5732. Reason: ConnectException[Connection refused]
Jul 06, 2016 6:43:18 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:23 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:23 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
}

Jul 06, 2016 6:43:23 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
}

Jul 06, 2016 6:43:24 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:25 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is STARTED
Jul 06, 2016 6:43:25 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Jul 06, 2016 6:43:25 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5732, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5732], bind any local is true
Jul 06, 2016 6:43:25 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5732
Jul 06, 2016 6:43:25 PM com.hazelcast.system
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Jul 06, 2016 6:43:25 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTING
Jul 06, 2016 6:43:25 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5730
Jul 06, 2016 6:43:25 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connecting to possible member: Address[127.0.0.1]:5731
Jul 06, 2016 6:43:25 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 54633 accepted socket connection from /127.0.0.1:5730
Jul 06, 2016 6:43:25 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:54633
Jul 06, 2016 6:43:25 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:54633
Jul 06, 2016 6:43:25 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:41286
Jul 06, 2016 6:43:25 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 41286 accepted socket connection from /127.0.0.1:5731
Jul 06, 2016 6:43:25 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:41286
Jul 06, 2016 6:43:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5731
Jul 06, 2016 6:43:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:26 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:31 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:31 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730 this
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732
}

Jul 06, 2016 6:43:31 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Jul 06, 2016 6:43:31 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [3] {
	Member [127.0.0.1]:5730
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Jul 06, 2016 6:43:32 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Sending join request to Address[127.0.0.1]:5730
Jul 06, 2016 6:43:33 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is STARTED
Hazelcast member hzMember name: hzProcessInstance
Hazelcast member h1 name: hzProcessInstance1
Hazelcast member h2 name: hzProcessInstance2
Cluster size 3
hzProcessInstance's InetSocketAddress: /127.0.0.1:5730
hzProcessInstance's InetSocketAddress: /127.0.0.1:5731
hzProcessInstance's InetSocketAddress: /127.0.0.1:5732
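
At this point three members (ports 5730-5732) have joined the DataONEBuildTest group, and the test harness prints the cluster size and each member's InetSocketAddress. A sketch of producing that kind of summary from a HazelcastInstance (generic Hazelcast 2.x cluster inspection, not the harness's own code):

    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.core.Member;

    public class ClusterSummarySketch {
        // Prints a summary like the "Cluster size 3" / "InetSocketAddress" lines above.
        public static void printMembers(HazelcastInstance hz) {
            System.out.println("Cluster size " + hz.getCluster().getMembers().size());
            for (Member member : hz.getCluster().getMembers()) {
                System.out.println("member InetSocketAddress: " + member.getInetSocketAddress());
            }
        }
    }
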
Jul 06, 2016 6:43:33 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcastTestClientConf.xml'.
20160706-18:43:33: [INFO]: group DataONEBuildTest addresses 127.0.0.1:5730 [org.dataone.cn.hazelcast.HazelcastClientFactory]
Jul 06, 2016 6:43:33 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Jul 06, 2016 6:43:34 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:57041
Jul 06, 2016 6:43:34 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:57041
Jul 06, 2016 6:43:34 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:57041 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Jul 06, 2016 6:43:34 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Jul 06, 2016 6:43:34 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Jul 06, 2016 6:43:34 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
Jul 06, 2016 6:43:34 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTING
Jul 06, 2016 6:43:34 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 is accepting socket connection from /127.0.0.1:57042
Jul 06, 2016 6:43:34 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] 5730 accepted socket connection from /127.0.0.1:57042
Jul 06, 2016 6:43:34 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5730 [DataONEBuildTest] received auth from Connection [/127.0.0.1:57042 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Jul 06, 2016 6:43:34 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Jul 06, 2016 6:43:34 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Jul 06, 2016 6:43:34 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is STARTED
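
The HazelcastClient STARTING/STARTED lines show two native clients authenticating against the member on 127.0.0.1:5730 (the "group DataONEBuildTest addresses 127.0.0.1:5730" line comes from dataone's HazelcastClientFactory). A rough sketch of such a client connection, assuming the Hazelcast 2.x client API; the group password is masked in the log, so the value below is a placeholder:

    import com.hazelcast.client.ClientConfig;
    import com.hazelcast.client.HazelcastClient;
    import com.hazelcast.core.HazelcastInstance;

    public class HazelcastClientSketch {
        public static void main(String[] args) {
            ClientConfig clientConfig = new ClientConfig();
            // Group name from the log; password is a placeholder.
            clientConfig.getGroupConfig().setName("DataONEBuildTest").setPassword("changeme");
            clientConfig.addAddress("127.0.0.1:5730");
            // Logs "HazelcastClient is STARTING ... CLIENT_CONNECTION_OPENED ... STARTED"
            HazelcastInstance client = HazelcastClient.newHazelcastClient(clientConfig);
            client.getMap("hzSystemMetadata");
        }
    }
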
20160706-18:43:34: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: loading into client truststore: java.io.InputStreamReader@4adbaad8 [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: 0 alias CN=DataONE Root CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: 1 alias CN=DataONE Production CA,DC=dataone,DC=org [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: 2 alias CN=CILogon Basic CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: 3 alias CN=CILogon OpenID CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: 4 alias CN=CILogon Silver CA 1,O=CILogon,C=US,DC=cilogon,DC=org [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: 5 alias CN=RapidSSL CA,O=GeoTrust\, Inc.,C=US [org.dataone.client.auth.CertificateManager]
20160706-18:43:34: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
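
The CertificateManager lines show a TLSv1.2 SSLContext being built around a bundled truststore containing the DataONE and CILogon CA certificates (aliases 0..5 above). A generic JSSE sketch of that kind of setup, not DataONE's CertificateManager itself; the truststore path and password are placeholders:

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.security.KeyStore;

    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManagerFactory;

    public class TlsContextSketch {
        public static SSLContext build() throws Exception {
            // Load a truststore holding the CA certificates.
            KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
            try (InputStream in = new FileInputStream("d1-truststore.jks")) { // placeholder path
                trustStore.load(in, "changeit".toCharArray());                // placeholder password
            }

            TrustManagerFactory tmf =
                    TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
            tmf.init(trustStore);

            // "...trying SSLContext protocol: TLSv1.2" / "...setting SSLContext with protocol: TLSv1.2"
            SSLContext ctx = SSLContext.getInstance("TLSv1.2");
            ctx.init(null, tmf.getTrustManagers(), null);
            return ctx;
        }
    }
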
20160706-18:43:34: [WARN]: Starting monitor thread [org.dataone.client.utils.HttpConnectionMonitorService]
20160706-18:43:34: [WARN]: Starting monitoring... [org.dataone.client.utils.HttpConnectionMonitorService]
20160706-18:43:34: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20160706-18:43:35: [INFO]: RestClient.doRequestNoBody, thread(1) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v2/node [org.dataone.client.rest.RestClient]
20160706-18:43:35: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
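
The RestClient line records a GET of the CN node list at https://cn-dev-ucsb-1.test.dataone.org/cn/v2/node returning HTTP 200. A minimal sketch of an equivalent request with Apache HttpClient, as a stand-in for dataone's RestClient (which additionally wires in the SSL socket factory configured above):

    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;

    public class NodeListSketch {
        public static void main(String[] args) throws Exception {
            try (CloseableHttpClient client = HttpClients.createDefault()) {
                HttpGet get = new HttpGet("https://cn-dev-ucsb-1.test.dataone.org/cn/v2/node");
                try (CloseableHttpResponse response = client.execute(get)) {
                    // The log reports "response httpCode: 200" for this call.
                    System.out.println("response httpCode: " + response.getStatusLine().getStatusCode());
                }
            }
        }
    }
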
20160706-18:43:35: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@7d6e5880: startup date [Wed Jul 06 18:43:35 UTC 2016]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationH2RepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Creating embedded database 'testdb' [org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory]
20160706-18:43:35: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20160706-18:43:35: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20160706-18:43:35: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20160706-18:43:35: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20160706-18:43:35: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20160706-18:43:35: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20160706-18:43:35: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20160706-18:43:35: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20160706-18:43:35: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20160706-18:43:35: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20160706-18:43:35: [INFO]: Using dialect: org.hibernate.dialect.H2Dialect [org.hibernate.dialect.Dialect]
20160706-18:43:35: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20160706-18:43:35: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20160706-18:43:35: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20160706-18:43:35: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20160706-18:43:35: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20160706-18:43:35: [INFO]: Type registration [wrapper_materialized_blob] overrides previous : org.hibernate.type.WrappedMaterializedBlobType@78fab13e [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@128a9b7f [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [blob] overrides previous : org.hibernate.type.BlobType@49e6b85b [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [java.sql.Blob] overrides previous : org.hibernate.type.BlobType@49e6b85b [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [wrapper_characters_clob] overrides previous : org.hibernate.type.CharacterArrayClobType@4a518444 [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [characters_clob] overrides previous : org.hibernate.type.PrimitiveCharacterArrayClobType@2dc2b27a [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [materialized_clob] overrides previous : org.hibernate.type.MaterializedClobType@49537f0e [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [clob] overrides previous : org.hibernate.type.ClobType@8ba95be [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Type registration [java.sql.Clob] overrides previous : org.hibernate.type.ClobType@8ba95be [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20160706-18:43:35: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20160706-18:43:35: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20160706-18:43:35: [INFO]: updating schema [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20160706-18:43:35: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TASK_QUEUE [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: columns: [id, nextexecution, status, pid, trycount] [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: indexes: [index_pid_task, index_exec_task, primary_key_1] [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: table found: TESTDB.PUBLIC.REPLICATION_TRY_HISTORY [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: columns: [id, lastreplicationattemptdate, pid, replicationattempts, nodeid] [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: foreign keys: [] [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: indexes: [index_pid, primary_key_b] [org.hibernate.tool.hbm2ddl.TableMetadata]
20160706-18:43:35: [INFO]: schema update complete [org.hibernate.tool.hbm2ddl.SchemaUpdate]
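
The preceding block shows Spring building a JPA EntityManagerFactory over an embedded H2 database named 'testdb', mapping ReplicationTask and ReplicationAttemptHistory to replication_task_queue and replication_try_history, and running an hbm2ddl schema update. A sketch of that wiring with Spring's embedded-database and Hibernate adapter APIs; this is a plausible reconstruction, not the project's ReplicationH2RepositoryFactory:

    import java.util.Properties;

    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabase;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
    import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
    import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

    public class H2RepositorySketch {
        public static LocalContainerEntityManagerFactoryBean entityManagerFactory() {
            // "Creating embedded database 'testdb'"
            EmbeddedDatabase dataSource = new EmbeddedDatabaseBuilder()
                    .setType(EmbeddedDatabaseType.H2)
                    .setName("testdb")
                    .build();

            HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
            adapter.setDatabasePlatform("org.hibernate.dialect.H2Dialect"); // "Using dialect: ...H2Dialect"

            LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
            emf.setDataSource(dataSource);
            emf.setJpaVendorAdapter(adapter);
            // Package of the entities named in the log (ReplicationTask, ReplicationAttemptHistory).
            emf.setPackagesToScan("org.dataone.cn.data.repository");

            Properties jpaProperties = new Properties();
            jpaProperties.setProperty("hibernate.hbm2ddl.auto", "update"); // "Running hbm2ddl schema update"
            emf.setJpaProperties(jpaProperties);
            return emf;
        }
    }
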
20160706-18:43:35: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@206e09ce: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationH2RepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationPostgresRepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationTaskRepository,replicationAttemptHistoryRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@72291d6: startup date [Wed Jul 06 18:43:35 UTC 2016]; root of context hierarchy [org.springframework.context.annotation.AnnotationConfigApplicationContext]
Jul 06, 2016 6:43:35 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Initializing cluster partition table first arrangement...
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'replicationAttemptHistoryRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: selectSession: using the default certificate location [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'replicationTaskRepository': replacing [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Root bean: class [org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'dataSource': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Overriding bean definition for bean 'jpaVendorAdapter': replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationH2RepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/dataone/cn/data/repository/ReplicationH2RepositoryFactory.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=replicationPostgresRepositoryFactory; factoryMethodName=jpaVendorAdapter; initMethodName=null; destroyMethodName=(inferred); defined in class org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory] [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: Building JPA container EntityManagerFactory for persistence unit 'default' [org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean]
20160706-18:43:35: [INFO]: Processing PersistenceUnitInfo [
	name: default
	...] [org.hibernate.ejb.Ejb3Configuration]
20160706-18:43:35: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationTask [org.hibernate.cfg.AnnotationBinder]
20160706-18:43:35: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationTask on table replication_task_queue [org.hibernate.cfg.annotations.EntityBinder]
20160706-18:43:35: [INFO]: Binding entity from annotated class: org.dataone.cn.data.repository.ReplicationAttemptHistory [org.hibernate.cfg.AnnotationBinder]
20160706-18:43:35: [INFO]: Bind entity org.dataone.cn.data.repository.ReplicationAttemptHistory on table replication_try_history [org.hibernate.cfg.annotations.EntityBinder]
20160706-18:43:35: [INFO]: Hibernate Validator not found: ignoring [org.hibernate.cfg.Configuration]
20160706-18:43:35: [INFO]: Unable to find org.hibernate.search.event.FullTextIndexEventListener on the classpath. Hibernate Search is not enabled. [org.hibernate.cfg.search.HibernateSearchEventListenerRegister]
20160706-18:43:35: [INFO]: Initializing connection provider: org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider [org.hibernate.connection.ConnectionProviderFactory]
20160706-18:43:35: [INFO]: Using provided datasource [org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider]
20160706-18:43:35: [INFO]: Using dialect: org.hibernate.dialect.PostgreSQLDialect [org.hibernate.dialect.Dialect]
20160706-18:43:35: [INFO]: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 [org.hibernate.engine.jdbc.JdbcSupportLoader]
20160706-18:43:35: [INFO]: Database ->
       name : H2
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Driver ->
       name : H2 JDBC Driver
    version : 1.3.163 (2011-12-30)
      major : 1
      minor : 3 [org.hibernate.cfg.SettingsFactory]
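The settings block above shows Hibernate resolving org.hibernate.dialect.PostgreSQLDialect while the connection metadata reports an H2 1.3.163 database; that mismatch is what produces the pg_class failures a few lines further down. If the test is meant to keep PostgreSQL-flavoured SQL against an in-memory database, H2's PostgreSQL compatibility mode narrows the gap somewhat; the following is only a sketch of such a test datasource (the URL, database name and credentials are placeholders, not values from this build):

    import javax.sql.DataSource;
    import org.h2.jdbcx.JdbcDataSource;

    // Hypothetical in-memory H2 datasource for the replication tests.
    // MODE=PostgreSQL relaxes H2's parsing toward PostgreSQL syntax, but it
    // does not provide catalog tables such as pg_class, so the Hibernate
    // dialect still has to match the actual database (see the error below).
    public class TestDataSourceSketch {
        public static DataSource inMemoryH2() {
            JdbcDataSource ds = new JdbcDataSource();
            ds.setURL("jdbc:h2:mem:replication_test;MODE=PostgreSQL;DB_CLOSE_DELAY=-1");
            ds.setUser("sa");
            ds.setPassword("");
            return ds;
        }
    }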
20160706-18:43:35: [INFO]: Using default transaction strategy (direct JDBC transactions) [org.hibernate.transaction.TransactionFactoryFactory]
20160706-18:43:35: [INFO]: No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) [org.hibernate.transaction.TransactionManagerLookupFactory]
20160706-18:43:35: [INFO]: Automatic flush during beforeCompletion(): disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Automatic session close at end of transaction: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JDBC batch size: 15 [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JDBC batch updates for versioned data: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Scrollable result sets: enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JDBC3 getGeneratedKeys(): enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Connection release mode: auto [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Default batch fetch size: 1 [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Generate SQL with comments: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Order SQL updates by primary key: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Order SQL inserts for batching: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Using ASTQueryTranslatorFactory [org.hibernate.hql.ast.ASTQueryTranslatorFactory]
20160706-18:43:35: [INFO]: Query language substitutions: {} [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: JPA-QL strict compliance: enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Second-level cache: enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Query cache: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Optimize cache for minimal puts: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Structured second-level cache entries: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Statistics: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Deleted entity synthetic identifier rollback: disabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Default entity-mode: pojo [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Named query checking : enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: Check Nullability in Core (should be disabled when Bean Validation is on): enabled [org.hibernate.cfg.SettingsFactory]
20160706-18:43:35: [INFO]: building session factory [org.hibernate.impl.SessionFactoryImpl]
20160706-18:43:35: [INFO]: Type registration [materialized_blob] overrides previous : org.hibernate.type.MaterializedBlobType@128a9b7f [org.hibernate.type.BasicTypeRegistry]
20160706-18:43:35: [INFO]: Not binding factory to JNDI, no JNDI name configured [org.hibernate.impl.SessionFactoryObjectFactory]
20160706-18:43:35: [INFO]: Running hbm2ddl schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20160706-18:43:35: [INFO]: fetching database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
20160706-18:43:35: [ERROR]: could not get database metadata [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
	at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
	at org.h2.message.DbException.get(DbException.java:169)
	at org.h2.message.DbException.get(DbException.java:146)
	at org.h2.command.Parser.readTableOrView(Parser.java:4758)
	at org.h2.command.Parser.readTableFilter(Parser.java:1080)
	at org.h2.command.Parser.parseSelectSimpleFromPart(Parser.java:1686)
	at org.h2.command.Parser.parseSelectSimple(Parser.java:1793)
	at org.h2.command.Parser.parseSelectSub(Parser.java:1680)
	at org.h2.command.Parser.parseSelectUnion(Parser.java:1523)
	at org.h2.command.Parser.parseSelect(Parser.java:1511)
	at org.h2.command.Parser.parsePrepared(Parser.java:405)
	at org.h2.command.Parser.parse(Parser.java:279)
	at org.h2.command.Parser.parse(Parser.java:251)
	at org.h2.command.Parser.prepareCommand(Parser.java:217)
	at org.h2.engine.Session.prepareLocal(Session.java:415)
	at org.h2.engine.Session.prepareCommand(Session.java:364)
	at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1121)
	at org.h2.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:70)
	at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.initSequences(DatabaseMetadata.java:151)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:69)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:62)
	at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:170)
	at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:375)
	at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
	at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
	at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
	at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:287)
	at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
	at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
	at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1105)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:915)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:472)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
	at org.dataone.cn.model.repository.D1BaseJpaRepositoryConfiguration.initContext(D1BaseJpaRepositoryConfiguration.java:39)
	at org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory.getReplicationTaskRepository(ReplicationPostgresRepositoryFactory.java:68)
	at org.dataone.service.cn.replication.ReplicationFactory.getReplicationTaskRepository(ReplicationFactory.java:74)
	at org.dataone.service.cn.replication.ReplicationTaskProcessor.<clinit>(ReplicationTaskProcessor.java:20)
	at org.dataone.service.cn.replication.ReplicationManager.startReplicationTaskProcessing(ReplicationManager.java:1028)
	at org.dataone.service.cn.replication.ReplicationManager.<init>(ReplicationManager.java:183)
	at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:175)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
	at org.junit.runners.Suite.runChild(Suite.java:128)
	at org.junit.runners.Suite.runChild(Suite.java:24)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
	at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
	at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
	at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
20160706-18:43:35: [ERROR]: could not complete schema update [org.hibernate.tool.hbm2ddl.SchemaUpdate]
org.h2.jdbc.JdbcSQLException: Table "PG_CLASS" not found; SQL statement:
select relname from pg_class where relkind='S' [42102-163]
	at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
	at org.h2.message.DbException.get(DbException.java:169)
	at org.h2.message.DbException.get(DbException.java:146)
	at org.h2.command.Parser.readTableOrView(Parser.java:4758)
	at org.h2.command.Parser.readTableFilter(Parser.java:1080)
	at org.h2.command.Parser.parseSelectSimpleFromPart(Parser.java:1686)
	at org.h2.command.Parser.parseSelectSimple(Parser.java:1793)
	at org.h2.command.Parser.parseSelectSub(Parser.java:1680)
	at org.h2.command.Parser.parseSelectUnion(Parser.java:1523)
	at org.h2.command.Parser.parseSelect(Parser.java:1511)
	at org.h2.command.Parser.parsePrepared(Parser.java:405)
	at org.h2.command.Parser.parse(Parser.java:279)
	at org.h2.command.Parser.parse(Parser.java:251)
	at org.h2.command.Parser.prepareCommand(Parser.java:217)
	at org.h2.engine.Session.prepareLocal(Session.java:415)
	at org.h2.engine.Session.prepareCommand(Session.java:364)
	at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1121)
	at org.h2.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:70)
	at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.initSequences(DatabaseMetadata.java:151)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:69)
	at org.hibernate.tool.hbm2ddl.DatabaseMetadata.<init>(DatabaseMetadata.java:62)
	at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:170)
	at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:375)
	at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
	at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
	at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
	at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:287)
	at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
	at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
	at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1105)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:915)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:472)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
	at org.dataone.cn.model.repository.D1BaseJpaRepositoryConfiguration.initContext(D1BaseJpaRepositoryConfiguration.java:39)
	at org.dataone.cn.data.repository.ReplicationPostgresRepositoryFactory.getReplicationTaskRepository(ReplicationPostgresRepositoryFactory.java:68)
	at org.dataone.service.cn.replication.ReplicationFactory.getReplicationTaskRepository(ReplicationFactory.java:74)
	at org.dataone.service.cn.replication.ReplicationTaskProcessor.<clinit>(ReplicationTaskProcessor.java:20)
	at org.dataone.service.cn.replication.ReplicationManager.startReplicationTaskProcessing(ReplicationManager.java:1028)
	at org.dataone.service.cn.replication.ReplicationManager.<init>(ReplicationManager.java:183)
	at org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit.testCreateAndQueueTasks(ReplicationManagerTestUnit.java:175)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
	at org.junit.runners.Suite.runChild(Suite.java:128)
	at org.junit.runners.Suite.runChild(Suite.java:24)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
	at org.dataone.test.apache.directory.server.integ.ApacheDSSuiteRunner.run(ApacheDSSuiteRunner.java:208)
	at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
	at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
	at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
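Both stack traces abort on the same statement: select relname from pg_class where relkind='S' is the sequence lookup that Hibernate's PostgreSQLDialect runs during the hbm2ddl schema update, and it cannot succeed on H2, which has no pg_class catalog. The bean overrides logged earlier (dataSource and jpaVendorAdapter switched from replicationH2RepositoryFactory to replicationPostgresRepositoryFactory) indicate the PostgreSQL configuration was applied even though the connection is the in-memory H2 test database. A minimal sketch of a vendor adapter pinned to H2 for a test context, assuming the Spring HibernateJpaVendorAdapter wiring the log implies (the class and method here are illustrative, not the project's actual factory code):

    import org.springframework.orm.jpa.vendor.Database;
    import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

    // Hypothetical test-side vendor adapter: pin the H2 dialect so the
    // schema update stops issuing PostgreSQL catalog queries.
    public class H2VendorAdapterSketch {
        public static HibernateJpaVendorAdapter h2Adapter() {
            HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
            adapter.setDatabase(Database.H2);
            adapter.setDatabasePlatform("org.hibernate.dialect.H2Dialect");
            adapter.setGenerateDdl(true); // let Hibernate create replication_task_queue for the run
            return adapter;
        }
    }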
20160706-18:43:35: [INFO]: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@37182c1: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,replicationPostgresRepositoryFactory,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,replicationH2RepositoryFactory,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#0,replicationAttemptHistoryRepository,replicationTaskRepository,org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor#1,dataSource,jpaVendorAdapter,entityManagerFactory,transactionManager]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]
20160706-18:43:35: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20160706-18:43:35: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ [42102-163] [org.hibernate.util.JDBCExceptionReporter]
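Because the schema update above aborted, replication_task_queue was never created, so this first query against it (and the retry logged below) fails with the same 42S02 "table not found" condition. One way to sidestep this in an H2-backed test, assuming the LocalContainerEntityManagerFactoryBean shown earlier is the factory in play, is to let Hibernate build and drop the schema for the test run; the property keys are standard Hibernate settings, while the helper itself is only illustrative:

    import java.util.HashMap;
    import java.util.Map;
    import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;

    // Hypothetical JPA properties for an in-memory test database: create the
    // schema at startup, drop it at shutdown, and use the H2 dialect.
    public class TestJpaPropertiesSketch {
        public static void applyTestProperties(LocalContainerEntityManagerFactoryBean emf) {
            Map<String, Object> props = new HashMap<String, Object>();
            props.put("hibernate.hbm2ddl.auto", "create-drop");
            props.put("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
            emf.setJpaPropertyMap(props);
        }
    }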
20160706-18:43:35: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20160706-18:43:35: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: Did not find a certificate for the subject specified: null [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: ...trying SSLContext protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: ...setting SSLContext with protocol: TLSv1.2 [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: creating custom TrustManager [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: getSSLConnectionSocketFactory: using allow-all hostname verifier [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [WARN]: registering ConnectionManager... [org.dataone.client.utils.HttpConnectionMonitorService]
20160706-18:43:35: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:35: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:35: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:35: [INFO]: selectSession: Using client certificate location: /etc/dataone/client/certs/null [org.dataone.client.auth.CertificateManager]
20160706-18:43:35: [INFO]: ReplicationManager is using an X509 certificate from /etc/dataone/client/certs/null [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:35: [INFO]: initialization [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:35: [INFO]: ReplicationManager D1Client base_url is: https://cn-dev-ucsb-1.test.dataone.org/cn/v2 [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:35: [WARN]: SQL Error: 42102, SQLState: 42S02 [org.hibernate.util.JDBCExceptionReporter]
20160706-18:43:35: [ERROR]: Table "REPLICATION_TASK_QUEUE" not found; SQL statement:
select replicatio0_.id as id48_, replicatio0_.nextExecution as nextExec2_48_, replicatio0_.pid as pid48_, replicatio0_.status as status48_, replicatio0_.tryCount as tryCount48_ from replication_task_queue replicatio0_ where replicatio0_.status=? and replicatio0_.nextExecution<? order by replicatio0_.nextExecution asc limit ? [42102-163] [org.hibernate.util.JDBCExceptionReporter]
20160706-18:43:36: [INFO]: RestClient.doRequestNoBody, thread(90) call Info: GET https://cn-dev-ucsb-1.test.dataone.org/cn/v1/node [org.dataone.client.rest.RestClient]
20160706-18:43:36: [INFO]: response httpCode: 200 [org.dataone.service.util.ExceptionHandler]
20160706-18:43:37: [INFO]: testCreateAndQueueTask replicationManager.createAndQueueTask [org.dataone.service.cn.replication.v2.ReplicationManagerTestUnit]
20160706-18:43:38: [INFO]: node initial refresh: new cached time: Jul 6, 2016 6:43:38 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20160706-18:43:38: [INFO]: for pid: 42 source MN: urn:node:testmn1 service info: MNRead v2 [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:38: [INFO]: for pid: 42 source MN: urn:node:testmn1 service info: MNRead v1 [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:38: [INFO]: for pid: 42, source MN: urn:node:testmn1 requires v2 replication. [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:38: [INFO]: for pid: 42, target MN: urn:node:testmn5 supports v2 MNReplication. [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:39: [INFO]: for pid: 42, target MN: urn:node:testmn2 supports v2 MNReplication. [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:39: [INFO]: Retrieving performance metrics for the potential replication list for 42 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:39: [INFO]: Priority score for urn:node:testmn5 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:39: [INFO]: Priority score for urn:node:testmn2 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:39: [WARN]: In Replication Manager, task that should exist 'in process' does not exist.  Creating new task for pid: 42 [org.dataone.service.cn.replication.ReplicationManager]
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.ClientHandlerService
SEVERE: [127.0.0.1]:5730 [DataONEBuildTest] null
java.lang.NullPointerException
	at com.hazelcast.impl.ConcurrentMapManager.doPutAll(ConcurrentMapManager.java:1025)
	at com.hazelcast.impl.ClientHandlerService$MapPutAllHandler.processMapOp(ClientHandlerService.java:895)
	at com.hazelcast.impl.ClientHandlerService$ClientMapOperationHandler.processCall(ClientHandlerService.java:1603)
	at com.hazelcast.impl.ClientHandlerService$ClientOperationHandler.handle(ClientHandlerService.java:1565)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:57)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:54)
	at com.hazelcast.impl.ClientRequestHandler.doRun(ClientRequestHandler.java:63)
	at com.hazelcast.impl.FallThroughRunnable.run(FallThroughRunnable.java:22)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
	at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
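The server-side NullPointerException in ConcurrentMapManager.doPutAll is raised while handling a client putAll on the 5730 member, which the log shows shutting down moments later; the exact trigger is not clear from this trace alone. Purely as a defensive sketch (the map name and entry types are hypothetical, not taken from this build), a client-side wrapper can at least avoid sending null keys or values and avoid calling into an instance that is no longer running:

    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.core.IMap;
    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical guard around a Hazelcast putAll from the client side.
    public class SafePutAllSketch {
        public static void putAllSafely(HazelcastInstance hz, Map<String, String> entries) {
            if (hz == null || entries == null || !hz.getLifecycleService().isRunning()) {
                return; // nothing safe to do
            }
            Map<String, String> cleaned = new HashMap<String, String>();
            for (Map.Entry<String, String> e : entries.entrySet()) {
                if (e.getKey() != null && e.getValue() != null) {
                    cleaned.put(e.getKey(), e.getValue());
                }
            }
            if (!cleaned.isEmpty()) {
                IMap<String, String> map = hz.getMap("hzReplicationTasks"); // hypothetical map name
                map.putAll(cleaned);
            }
        }
    }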

20160706-18:43:39: [INFO]: Number of replicas desired for identifier 42 is 3 [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:39: [INFO]: Potential target node list size for 42 is 2 [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:39: [INFO]: Changed the desired replicas for identifier 42 to the size of the potential target node list: 2 [org.dataone.service.cn.replication.ReplicationManager]
20160706-18:43:39: [INFO]: node initial refresh: new cached time: Jul 6, 2016 6:43:39 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20160706-18:43:39: [INFO]: node initial refresh: new cached time: Jul 6, 2016 6:43:39 PM [org.dataone.service.cn.v2.impl.NodeRegistryServiceImpl]
20160706-18:43:39: [INFO]: Added 2 MNReplicationTasks to the queue for 42 [org.dataone.service.cn.replication.ReplicationManager]
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTTING_DOWN
Jul 06, 2016 6:43:39 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Jul 06, 2016 6:43:39 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: java.io.EOFException[null]
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-1 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5730] lost. Reason: Explicit close
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5730 [DataONEBuildTest] hz.hzProcessInstance.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5732, Cause:java.io.EOFException
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Jul 06, 2016 6:43:39 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731
	Member [127.0.0.1]:5732 this
}

Jul 06, 2016 6:43:39 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [2] {
	Member [127.0.0.1]:5731 this
	Member [127.0.0.1]:5732
}

Jul 06, 2016 6:43:39 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5730
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:57042] lost. Reason: Explicit close
Jul 06, 2016 6:43:39 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Jul 06, 2016 6:43:39 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Connection [Address[127.0.0.1]:57041] lost. Reason: Explicit close
Jul 06, 2016 6:43:39 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [0] [localhost/127.0.0.1:5730 -> 127.0.0.1:5730] is lost
Jul 06, 2016 6:43:39 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:41073
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:41073
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 is accepting socket connection from /127.0.0.1:41074
Jul 06, 2016 6:43:39 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5732 [DataONEBuildTest] 5732 accepted socket connection from /127.0.0.1:41074
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:41073 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5732 [DataONEBuildTest] received auth from Connection [/127.0.0.1:41074 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Jul 06, 2016 6:43:39 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Jul 06, 2016 6:43:39 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Jul 06, 2016 6:43:39 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Jul 06, 2016 6:43:39 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Jul 06, 2016 6:43:39 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Destroying node initializer.
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Hazelcast Shutdown is completed in 725 ms.
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5730 [DataONEBuildTest] Address[127.0.0.1]:5730 is SHUTDOWN
Jul 06, 2016 6:43:39 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTTING_DOWN
Jul 06, 2016 6:43:40 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Removing Address Address[127.0.0.1]:5732
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:5732] lost. Reason: Explicit close
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:5731] lost. Reason: java.io.EOFException[null]
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.ReadHandler
WARNING: [127.0.0.1]:5732 [DataONEBuildTest] hz.hzProcessInstance2.IO.thread-2 Closing socket to endpoint Address[127.0.0.1]:5731, Cause:java.io.EOFException
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[2]. PartitionReplicaChangeEvent{partitionId=2, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[3]. PartitionReplicaChangeEvent{partitionId=3, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[4]. PartitionReplicaChangeEvent{partitionId=4, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[5]. PartitionReplicaChangeEvent{partitionId=5, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[11]. PartitionReplicaChangeEvent{partitionId=11, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[15]. PartitionReplicaChangeEvent{partitionId=15, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[19]. PartitionReplicaChangeEvent{partitionId=19, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[21]. PartitionReplicaChangeEvent{partitionId=21, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[23]. PartitionReplicaChangeEvent{partitionId=23, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[29]. PartitionReplicaChangeEvent{partitionId=29, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[30]. PartitionReplicaChangeEvent{partitionId=30, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[32]. PartitionReplicaChangeEvent{partitionId=32, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[37]. PartitionReplicaChangeEvent{partitionId=37, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[41]. PartitionReplicaChangeEvent{partitionId=41, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[45]. PartitionReplicaChangeEvent{partitionId=45, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[48]. PartitionReplicaChangeEvent{partitionId=48, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[49]. PartitionReplicaChangeEvent{partitionId=49, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[50]. PartitionReplicaChangeEvent{partitionId=50, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[52]. PartitionReplicaChangeEvent{partitionId=52, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[59]. PartitionReplicaChangeEvent{partitionId=59, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[61]. PartitionReplicaChangeEvent{partitionId=61, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[62]. PartitionReplicaChangeEvent{partitionId=62, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[63]. PartitionReplicaChangeEvent{partitionId=63, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[64]. PartitionReplicaChangeEvent{partitionId=64, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[66]. PartitionReplicaChangeEvent{partitionId=66, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[68]. PartitionReplicaChangeEvent{partitionId=68, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[74]. PartitionReplicaChangeEvent{partitionId=74, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[75]. PartitionReplicaChangeEvent{partitionId=75, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[87]. PartitionReplicaChangeEvent{partitionId=87, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[90]. PartitionReplicaChangeEvent{partitionId=90, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[91]. PartitionReplicaChangeEvent{partitionId=91, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[92]. PartitionReplicaChangeEvent{partitionId=92, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[93]. PartitionReplicaChangeEvent{partitionId=93, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[97]. PartitionReplicaChangeEvent{partitionId=97, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[101]. PartitionReplicaChangeEvent{partitionId=101, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[102]. PartitionReplicaChangeEvent{partitionId=102, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[103]. PartitionReplicaChangeEvent{partitionId=103, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[108]. PartitionReplicaChangeEvent{partitionId=108, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[111]. PartitionReplicaChangeEvent{partitionId=111, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[112]. PartitionReplicaChangeEvent{partitionId=112, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[113]. PartitionReplicaChangeEvent{partitionId=113, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[115]. PartitionReplicaChangeEvent{partitionId=115, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[116]. PartitionReplicaChangeEvent{partitionId=116, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[123]. PartitionReplicaChangeEvent{partitionId=123, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[125]. PartitionReplicaChangeEvent{partitionId=125, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[126]. PartitionReplicaChangeEvent{partitionId=126, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[129]. PartitionReplicaChangeEvent{partitionId=129, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[137]. PartitionReplicaChangeEvent{partitionId=137, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[139]. PartitionReplicaChangeEvent{partitionId=139, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[141]. PartitionReplicaChangeEvent{partitionId=141, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[142]. PartitionReplicaChangeEvent{partitionId=142, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[145]. PartitionReplicaChangeEvent{partitionId=145, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[146]. PartitionReplicaChangeEvent{partitionId=146, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[148]. PartitionReplicaChangeEvent{partitionId=148, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[149]. PartitionReplicaChangeEvent{partitionId=149, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[152]. PartitionReplicaChangeEvent{partitionId=152, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[153]. PartitionReplicaChangeEvent{partitionId=153, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[158]. PartitionReplicaChangeEvent{partitionId=158, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[164]. PartitionReplicaChangeEvent{partitionId=164, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[167]. PartitionReplicaChangeEvent{partitionId=167, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[168]. PartitionReplicaChangeEvent{partitionId=168, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[172]. PartitionReplicaChangeEvent{partitionId=172, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[176]. PartitionReplicaChangeEvent{partitionId=176, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[180]. PartitionReplicaChangeEvent{partitionId=180, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[185]. PartitionReplicaChangeEvent{partitionId=185, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[186]. PartitionReplicaChangeEvent{partitionId=186, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[187]. PartitionReplicaChangeEvent{partitionId=187, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[189]. PartitionReplicaChangeEvent{partitionId=189, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[190]. PartitionReplicaChangeEvent{partitionId=190, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[193]. PartitionReplicaChangeEvent{partitionId=193, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[196]. PartitionReplicaChangeEvent{partitionId=196, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[205]. PartitionReplicaChangeEvent{partitionId=205, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[211]. PartitionReplicaChangeEvent{partitionId=211, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[212]. PartitionReplicaChangeEvent{partitionId=212, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[216]. PartitionReplicaChangeEvent{partitionId=216, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[217]. PartitionReplicaChangeEvent{partitionId=217, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[218]. PartitionReplicaChangeEvent{partitionId=218, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[224]. PartitionReplicaChangeEvent{partitionId=224, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[225]. PartitionReplicaChangeEvent{partitionId=225, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[231]. PartitionReplicaChangeEvent{partitionId=231, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[234]. PartitionReplicaChangeEvent{partitionId=234, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[236]. PartitionReplicaChangeEvent{partitionId=236, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[244]. PartitionReplicaChangeEvent{partitionId=244, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[247]. PartitionReplicaChangeEvent{partitionId=247, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[249]. PartitionReplicaChangeEvent{partitionId=249, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[250]. PartitionReplicaChangeEvent{partitionId=250, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[252]. PartitionReplicaChangeEvent{partitionId=252, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[261]. PartitionReplicaChangeEvent{partitionId=261, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[264]. PartitionReplicaChangeEvent{partitionId=264, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[265]. PartitionReplicaChangeEvent{partitionId=265, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
WARNING: [127.0.0.1]:5731 [DataONEBuildTest] Owner of partition is being removed! Possible data loss for partition[267]. PartitionReplicaChangeEvent{partitionId=267, replicaIndex=0, oldAddress=Address[127.0.0.1]:5732, newAddress=null}
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.PartitionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Starting to send partition replica diffs...true
Jul 06, 2016 6:43:40 PM com.hazelcast.cluster.ClusterManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 

Members [1] {
	Member [127.0.0.1]:5731 this
}

Jul 06, 2016 6:43:40 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:41073] lost. Reason: Explicit close
Jul 06, 2016 6:43:40 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Jul 06, 2016 6:43:40 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [1] [localhost/127.0.0.1:5732 -> 127.0.0.1:5732] is lost
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Connection [Address[127.0.0.1]:41074] lost. Reason: Explicit close
Jul 06, 2016 6:43:40 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Jul 06, 2016 6:43:40 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49378
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.SocketAcceptor
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 is accepting socket connection from /127.0.0.1:49377
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49377
Jul 06, 2016 6:43:40 PM com.hazelcast.nio.ConnectionManager
INFO: [127.0.0.1]:5731 [DataONEBuildTest] 5731 accepted socket connection from /127.0.0.1:49378
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49377 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Jul 06, 2016 6:43:40 PM com.hazelcast.impl.ClientHandlerService
INFO: [127.0.0.1]:5731 [DataONEBuildTest] received auth from Connection [/127.0.0.1:49378 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Jul 06, 2016 6:43:40 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Jul 06, 2016 6:43:40 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENING
Jul 06, 2016 6:43:40 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Jul 06, 2016 6:43:40 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_OPENED
Jul 06, 2016 6:43:41 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Destroying node initializer.
Jul 06, 2016 6:43:41 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Hazelcast Shutdown is completed in 1610 ms.
Jul 06, 2016 6:43:41 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5732 [DataONEBuildTest] Address[127.0.0.1]:5732 is SHUTDOWN
Jul 06, 2016 6:43:41 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTTING_DOWN
Jul 06, 2016 6:43:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49377] lost. Reason: Explicit close
Jul 06, 2016 6:43:42 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Jul 06, 2016 6:43:42 PM com.hazelcast.client.ConnectionManager
WARNING: Connection to Connection [2] [localhost/127.0.0.1:5731 -> 127.0.0.1:5731] is lost
Jul 06, 2016 6:43:42 PM com.hazelcast.nio.Connection
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Connection [Address[127.0.0.1]:49378] lost. Reason: Explicit close
Jul 06, 2016 6:43:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Jul 06, 2016 6:43:42 PM com.hazelcast.client.LifecycleServiceClientImpl
INFO: HazelcastClient is CLIENT_CONNECTION_LOST
Jul 06, 2016 6:43:42 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 5,000 ms later, attempt 1 of 1.
Jul 06, 2016 6:43:42 PM com.hazelcast.client.ConnectionManager
INFO: Unable to get alive cluster connection, try in 5,000 ms later, attempt 1 of 1.
Jul 06, 2016 6:43:42 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Destroying node initializer.
Jul 06, 2016 6:43:42 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Hazelcast Shutdown is completed in 1324 ms.
Jul 06, 2016 6:43:42 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5731 [DataONEBuildTest] Address[127.0.0.1]:5731 is SHUTDOWN
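The Hazelcast lines above record the teardown of an embedded two-member test cluster ([127.0.0.1]:5731 and :5732) plus its clients: member 5732 leaves first, the surviving member 5731 logs "Possible data loss" for every partition whose only replica lived on 5732, the clients briefly reconnect to 5731, and finally 5731 itself shuts down. A minimal sketch of that lifecycle, assuming a Hazelcast 3.x-style API and hypothetical port/join settings (not the actual DataONEBuildTest configuration):

import com.hazelcast.config.Config;
import com.hazelcast.config.NetworkConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.core.LifecycleEvent;
import com.hazelcast.core.LifecycleListener;

public class TwoMemberTeardownSketch {

    public static void main(String[] args) {
        HazelcastInstance first = Hazelcast.newHazelcastInstance(localConfig(5731));
        HazelcastInstance second = Hazelcast.newHazelcastInstance(localConfig(5732));

        // Print lifecycle transitions (SHUTTING_DOWN, SHUTDOWN, ...),
        // analogous to the LifecycleServiceImpl INFO lines in the log above.
        LifecycleListener printer = new LifecycleListener() {
            @Override
            public void stateChanged(LifecycleEvent event) {
                System.out.println("Lifecycle state: " + event.getState());
            }
        };
        first.getLifecycleService().addLifecycleListener(printer);
        second.getLifecycleService().addLifecycleListener(printer);

        // Shutting down the second member first leaves the first member holding
        // all partitions; any partition whose only copy lived on the departing
        // member is what produces the "Possible data loss" warnings.
        second.getLifecycleService().shutdown();
        first.getLifecycleService().shutdown();
    }

    // Hypothetical single-host config: fixed port, multicast off, TCP/IP join.
    private static Config localConfig(int port) {
        Config config = new Config();
        NetworkConfig net = config.getNetworkConfig();
        net.setPort(port);
        net.setPortAutoIncrement(false);
        net.getJoin().getMulticastConfig().setEnabled(false);
        net.getJoin().getTcpIpConfig().setEnabled(true);
        net.getJoin().getTcpIpConfig().addMember("127.0.0.1");
        return config;
    }
}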
20160706-18:43:42: [WARN]: javax.naming.CommunicationException: localhost:9389 connection closed [org.dataone.cn.ldap.DirContextUnsolicitedNotificationListener]
20160706-18:43:42: [INFO]: Unbind of an LDAP service (9389) is complete. [org.apache.directory.server.ldap.LdapServer]
20160706-18:43:42: [INFO]: Sending notice of disconnect to existing clients sessions. [org.apache.directory.server.ldap.LdapServer]
20160706-18:43:42: [INFO]: Ldap service stopped. [org.apache.directory.server.ldap.LdapServer]
20160706-18:43:43: [INFO]: clearing all the caches [org.apache.directory.server.core.api.CacheService]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.924 sec
Running org.dataone.service.cn.replication.v2.TestReplicationPrioritization
20160706-18:43:43: [INFO]: Retrieving performance metrics for the potential replication list for testPid1 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:43: [INFO]: Node node3 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:43: [INFO]: Priority score for node1 is 1.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:43: [INFO]: Priority score for node2 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:43: [INFO]: Priority score for node3 is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:43: [INFO]: Priority score for node4 is 2.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:43: [INFO]: Removed node3, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
20160706-18:43:43: [INFO]: Removed node2, score is 0.0 [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node4
Node: node1
20160706-18:43:43: [INFO]: Node node1 is currently over its request limit of 10 requests. [org.dataone.service.cn.replication.ReplicationPrioritizationStrategy]
Node: node1 request factor: 0.0
Node: node4 request factor: 1.0
Node: node2 request factor: 1.0
Node: node3 request factor: 1.0
Node: node1 request factor: 1.0
Node: node4 request factor: 1.0
Node: node2 request factor: 0.0
Node: node3 request factor: 0.8333333
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.04 sec
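The prioritization output above shows the behavior the test exercises: a node that is over its request limit (10 pending requests here) ends up with a score of 0.0 and is removed from the candidate list, while the remaining nodes keep non-zero request factors. The real logic lives in ReplicationPrioritizationStrategy in d1_replication; the following is only a hypothetical sketch of that kind of over-limit filtering, with invented names (filterOverloaded, pendingRequests, requestLimit) used purely for illustration:

import java.util.LinkedHashMap;
import java.util.Map;

public class RequestLimitFilterSketch {

    // Drops candidates whose pending request count is at or over the limit,
    // mirroring the "over its request limit ... Removed nodeX, score is 0.0"
    // pattern in the test log. Purely illustrative; not the real strategy.
    static Map<String, Double> filterOverloaded(Map<String, Double> scores,
                                                Map<String, Integer> pendingRequests,
                                                int requestLimit) {
        Map<String, Double> kept = new LinkedHashMap<String, Double>();
        for (Map.Entry<String, Double> entry : scores.entrySet()) {
            Integer pending = pendingRequests.get(entry.getKey());
            int count = (pending == null) ? 0 : pending;
            if (count >= requestLimit) {
                System.out.println("Removed " + entry.getKey() + ", score is 0.0");
            } else {
                kept.put(entry.getKey(), entry.getValue());
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        Map<String, Double> scores = new LinkedHashMap<String, Double>();
        scores.put("node1", 1.0);
        scores.put("node3", 0.0);
        Map<String, Integer> pending = new LinkedHashMap<String, Integer>();
        pending.put("node3", 12); // over the hypothetical limit of 10
        System.out.println(filterOverloaded(scores, pending, 10)); // prints {node1=1.0}
    }
}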
20160706-18:43:43: [INFO]: Closing org.springframework.context.support.GenericApplicationContext@afd6b6c: startup date [Wed Jul 06 18:43:15 UTC 2016]; root of context hierarchy [org.springframework.context.support.GenericApplicationContext]
20160706-18:43:43: [INFO]: Destroying singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@9102c2e: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalPersistenceAnnotationProcessor,mylog,log4jInitialization,readSystemMetadataResource,nodeListResource,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [org.springframework.beans.factory.support.DefaultListableBeanFactory]

Results :

Tests run: 15, Failures: 0, Errors: 0, Skipped: 0

[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ d1_replication ---
[INFO] Building jar: /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/d1_replication-2.3.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.3:install (default-install) @ d1_replication ---
[INFO] Installing /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/d1_replication-2.3.0-SNAPSHOT.jar to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-SNAPSHOT.jar
[INFO] Installing /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/pom.xml to /var/lib/jenkins/.m2/repository/org/dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-SNAPSHOT.pom
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.2:create (default) @ d1_replication ---
[INFO] Checking for local modifications: skipped.
[INFO] Updating project files from SCM: skipped.
[INFO] Executing: /bin/sh -c cd /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication && svn --non-interactive info
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication
[INFO] Storing buildNumber: 18199 at timestamp: 1467830626421
[INFO] Executing: /bin/sh -c cd /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication && svn --non-interactive info
[INFO] Working directory: /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication
[INFO] Storing buildScmBranch: trunk
[WARNING] Failed to getClass for org.apache.maven.plugin.javadoc.JavadocReport
[INFO] 
[INFO] --- maven-javadoc-plugin:2.9.1:javadoc (default-cli) @ d1_replication ---
[INFO] 
Loading source files for package org.dataone.cn.data.repository...
Loading source files for package org.dataone.service.cn.replication...
Loading source files for package org.dataone.service.cn.replication.v2...
Loading source files for package org.dataone.service.cn.replication.v1...
Constructing Javadoc information...
Standard Doclet version 1.7.0_101
Building tree for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/overview-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/constant-values.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/serialized-form.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTaskRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistoryRepository.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationPostgresRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationAttemptHistory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/class-use/ReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/RejectedReplicationTaskHandler.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/QueuedReplicationAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/MNReplicationTask.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationService.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationStatusMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationRepositoryFactory.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationEventListener.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ApiVersion.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskProcessor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/StaleReplicationRequestAuditor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationPrioritizationStrategy.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskMonitor.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationTaskQueue.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/class-use/ReplicationManager.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/class-use/MNCommunication.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/cn/data/repository/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v1/package-use.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/org/dataone/service/cn/replication/v2/package-use.html...
Building index for all the packages and classes...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/overview-tree.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/index-all.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/deprecated-list.html...
Building index for all classes...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/allclasses-frame.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/allclasses-noframe.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/index.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/overview-summary.html...
Generating /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/site/apidocs/help-doc.html...
16 warnings
[WARNING] Javadoc Warnings
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:258: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:280: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:303: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:499: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:542: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:572: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:622: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:406: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:805: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationManager.java:146: warning - @param argument "repAttemptHistoryRepos" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:140: warning - @param argument "serialVersion" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationService.java:331: warning - @param argument "session" is not a parameter name.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:82: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:99: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:117: warning - @return tag has no arguments.
[WARNING] /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/src/main/java/org/dataone/service/cn/replication/ReplicationTaskQueue.java:206: warning - @return tag has no arguments.
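The 16 warnings above are standard JDK 7 doclet complaints: "@return tag has no arguments" means an @return tag with no description text, and "@param argument \"...\" is not a parameter name" means the name given to @param does not match any parameter in the method signature. A minimal sketch of the fix, using a hypothetical method rather than the real ReplicationManager or ReplicationTaskQueue code:

public class JavadocWarningFixSketch {

    /**
     * Counts queued replication tasks for a member node.
     *
     * A bare "@return" with nothing after it triggers
     * "warning - @return tag has no arguments"; adding a description resolves it.
     * Likewise the @param name below must match the parameter exactly, otherwise
     * the doclet reports "@param argument ... is not a parameter name".
     *
     * @param nodeId identifier of the member node being inspected
     * @return the number of queued tasks for that node
     */
    public int countQueuedTasks(String nodeId) {
        return 0; // placeholder body; only the Javadoc comment matters here
    }
}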
[JENKINS] Archiving  javadoc
Notifying upstream projects of job completion
Join notifier requires a CauseAction
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:03.798s
[INFO] Finished at: Wed Jul 06 18:43:52 UTC 2016
[INFO] Final Memory: 56M/537M
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data
[JENKINS] Archiving /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/pom.xml to org.dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-SNAPSHOT.pom
[JENKINS] Archiving /var/lib/jenkins/jobs/d1_replication/workspace/d1_replication/target/d1_replication-2.3.0-SNAPSHOT.jar to org.dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-SNAPSHOT.jar
channel stopped
Maven RedeployPublisher use remote  maven settings from : /usr/share/maven/conf/settings.xml
[ERROR] uniqueVersion == false is not anymore supported in maven 3
[INFO] Deployment in file:///var/www/maven (id=,uniqueVersion=false)
Deploying the main artifact d1_replication-2.3.0-SNAPSHOT.jar
Downloading: file:///var/www/maven/org/dataone/d1_replication/2.3.0-SNAPSHOT/maven-metadata.xml
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-20160706.184353-1.jar
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-20160706.184353-1.jar (73 KB at 24182.0 KB/sec)
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-20160706.184353-1.pom
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.3.0-SNAPSHOT/d1_replication-2.3.0-20160706.184353-1.pom (7 KB)
Downloading: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml
Downloaded: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml (2 KB at 142.3 KB/sec)
Uploading: file:///var/www/maven/org/dataone/d1_replication/2.3.0-SNAPSHOT/maven-metadata.xml
Uploaded: file:///var/www/maven/org/dataone/d1_replication/2.3.0-SNAPSHOT/maven-metadata.xml (775 B)
Uploading: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml
Uploaded: file:///var/www/maven/org/dataone/d1_replication/maven-metadata.xml (2 KB at 1319.3 KB/sec)
[INFO] Deployment done in 0.19 sec
IRC notifier plugin: Sending notification to: #dataone-build
Notifying upstream projects of job completion
Warning: you have no plugins providing access control for builds, so falling back to legacy behavior of permitting any downstream builds to be triggered
Finished: SUCCESS