Established TCP socket on 40451
<===[JENKINS REMOTING CAPACITY]===>channel started
Executing Maven: -B -f /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/pom.xml clean install
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.dataone:d1_cn_index_processor:jar:2.4.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for com.mycila.maven-license-plugin:maven-license-plugin is missing. @ line 343, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 335, column 15
[WARNING] 'build.plugins.plugin.version' for org.codehaus.mojo:buildnumber-maven-plugin is missing. @ line 350, column 21
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO]
[INFO] -----------------< org.dataone:d1_cn_index_processor >------------------
[INFO] Building d1_cn_index_processor 2.4.0-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] Downloading from com.springsource.repository.bundles.release: http://repository.springsource.com/maven/bundles/release/org/dataone/d1_common_java/2.4.0-SNAPSHOT/maven-metadata.xml
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ d1_cn_index_processor ---
[INFO]
[INFO] --- buildnumber-maven-plugin:1.4:create (default) @ d1_cn_index_processor ---
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace
[INFO] Storing buildNumber: null at timestamp: 1571180103056
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace
[INFO] Storing buildScmBranch: UNKNOWN_BRANCH
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ d1_cn_index_processor ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 67 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ d1_cn_index_processor ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 92 source files to /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/classes
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/resourcemap/ResourceMapDataSource.java: Some input files use or override a deprecated API.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/resourcemap/ResourceMapDataSource.java: Recompile with -Xlint:deprecation for details.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/XMLNamespaceConfig.java: Some input files use unchecked or unsafe operations.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/XMLNamespaceConfig.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ d1_cn_index_processor ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 314 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ d1_cn_index_processor ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 50 source files to /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/test-classes
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/InvalidXmlCharTest.java: Some input files use or override a deprecated API.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/InvalidXmlCharTest.java: Recompile with -Xlint:deprecation for details.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/DataONESolrJettyTestBase.java: Some input files use unchecked or unsafe operations.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/DataONESolrJettyTestBase.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.20:test (default-test) @ d1_cn_index_processor ---
[INFO]
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.dataone.cn.indexer.AppTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.dataone.cn.indexer.AppTest
[INFO] Running org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest
Creating dataDir: /tmp/org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest_33F54DF03727D939-001/init-core-data-001
ERROR IN SolrLogFormatter! original message:Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:35)
	at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:30)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:30)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.impl.AddressPicker.log(AddressPicker.java:330)
	at com.hazelcast.impl.AddressPicker.getInterfaces(AddressPicker.java:208)
	at com.hazelcast.impl.AddressPicker.pickAddress(AddressPicker.java:131)
	at com.hazelcast.impl.AddressPicker.pickAddress(AddressPicker.java:51)
	at com.hazelcast.impl.Node.<init>(Node.java:144)
	at com.hazelcast.impl.FactoryImpl.<init>(FactoryImpl.java:386)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:133)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
	at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.impl.AddressPicker.log(AddressPicker.java:330)
	at com.hazelcast.impl.AddressPicker.pickLoopbackAddress(AddressPicker.java:262)
	at com.hazelcast.impl.AddressPicker.pickAddress(AddressPicker.java:134)
	at com.hazelcast.impl.AddressPicker.pickAddress(AddressPicker.java:51)
	at com.hazelcast.impl.Node.<init>(Node.java:144)
	at com.hazelcast.impl.FactoryImpl.<init>(FactoryImpl.java:386)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:133)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
	at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:Picked Address[127.0.0.1]:5720, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5720], bind any local is true
Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.impl.AddressPicker.log(AddressPicker.java:330)
	at com.hazelcast.impl.AddressPicker.pickAddress(AddressPicker.java:105)
	at com.hazelcast.impl.Node.<init>(Node.java:144)
	at com.hazelcast.impl.FactoryImpl.<init>(FactoryImpl.java:386)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:133)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
	at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5720
Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.impl.base.DefaultNodeInitializer.printNodeInfo(DefaultNodeInitializer.java:50)
	at com.hazelcast.impl.Node.<init>(Node.java:181)
	at com.hazelcast.impl.FactoryImpl.<init>(FactoryImpl.java:386)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:133)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
	at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.impl.base.DefaultNodeInitializer.printNodeInfo(DefaultNodeInitializer.java:52)
	at com.hazelcast.impl.Node.<init>(Node.java:181)
	at com.hazelcast.impl.FactoryImpl.<init>(FactoryImpl.java:386)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:133)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
	at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTING
Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.impl.LifecycleServiceImpl.fireLifecycleEvent(LifecycleServiceImpl.java:63)
	at com.hazelcast.impl.LifecycleServiceImpl.fireLifecycleEvent(LifecycleServiceImpl.java:59)
	at com.hazelcast.impl.FactoryImpl.<init>(FactoryImpl.java:387)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:133)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
	at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:[127.0.0.1]:5720 [DataONEBuildTest]
Members [1] {
    Member [127.0.0.1]:5720 this
}
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
    at com.hazelcast.impl.AbstractJoiner$1.process(AbstractJoiner.java:118)
    at com.hazelcast.cluster.ClusterService$1.process(ClusterService.java:127)
    at com.hazelcast.cluster.ClusterService.processProcessable(ClusterService.java:190)
    at com.hazelcast.cluster.ClusterService.dequeueProcessables(ClusterService.java:256)
    at com.hazelcast.cluster.ClusterService.run(ClusterService.java:201)
    at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:[127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTED
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
    at com.hazelcast.impl.LifecycleServiceImpl.fireLifecycleEvent(LifecycleServiceImpl.java:63)
    at com.hazelcast.impl.LifecycleServiceImpl.fireLifecycleEvent(LifecycleServiceImpl.java:59)
    at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:175)
    at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
    at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
    at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
    at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
    at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.init(ProvRdfXmlProcessorTest.java:144)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at java.lang.Thread.run(Thread.java:748)
[ INFO] 2019-10-15 22:55:09,144 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ERROR] 2019-10-15 22:55:09,641 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/node.properties
[ERROR] 2019-10-15 22:55:09,647 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/index/d1client.properties
[ WARN] 2019-10-15 22:55:09,657 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.util.PerformanceLogger:<init>:19) Setting up PerformanceLogger: set to enabled? true
[ WARN] 2019-10-15 22:55:10,577 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:55:10,869 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-10-15 22:55:11,270 [coreLoadExecutor-5-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-10-15 22:55:11,274 [coreLoadExecutor-5-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-10-15 22:55:11,366 [coreLoadExecutor-5-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-10-15 22:55:11,503 [coreLoadExecutor-5-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
ERROR IN SolrLogFormatter!
original message:Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
    at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:35)
    at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:30)
    at org.dataone.cn.hazelcast.HazelcastClientFactory.getHazelcastClientUsingConfig(HazelcastClientFactory.java:145)
    at org.dataone.cn.hazelcast.HazelcastClientFactory.getStorageClient(HazelcastClientFactory.java:93)
    at org.dataone.cn.hazelcast.HazelcastClientFactory.getSystemMetadataMap(HazelcastClientFactory.java:71)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.addSystemMetadata(ProvRdfXmlProcessorTest.java:559)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.insertResource(ProvRdfXmlProcessorTest.java:537)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.testInsertProvResourceMap(ProvRdfXmlProcessorTest.java:322)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:HazelcastClient is STARTING
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
    at com.hazelcast.client.LifecycleServiceClientImpl.fireLifecycleEvent(LifecycleServiceClientImpl.java:83)
    at com.hazelcast.client.LifecycleServiceClientImpl$1.call(LifecycleServiceClientImpl.java:72)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:[127.0.0.1]:5720 [DataONEBuildTest] 5720 is accepting socket connection from /127.0.0.1:50614
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
    at com.hazelcast.nio.SocketAcceptor.log(SocketAcceptor.java:142)
    at com.hazelcast.nio.SocketAcceptor.log(SocketAcceptor.java:138)
    at com.hazelcast.nio.SocketAcceptor.access$000(SocketAcceptor.java:28)
    at com.hazelcast.nio.SocketAcceptor$1.run(SocketAcceptor.java:111)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
    at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
ERROR IN SolrLogFormatter!
original message:[127.0.0.1]:5720 [DataONEBuildTest] 5720 accepted socket connection from /127.0.0.1:50614
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
    at com.hazelcast.nio.ConnectionManager.log(ConnectionManager.java:475)
    at com.hazelcast.nio.ConnectionManager.assignSocketChannel(ConnectionManager.java:260)
    at com.hazelcast.nio.SocketAcceptor$1.run(SocketAcceptor.java:122)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
    at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
ERROR IN SolrLogFormatter!
original message:[127.0.0.1]:5720 [DataONEBuildTest] received auth from Connection [/127.0.0.1:50614 -> null] live=true, client=true, type=CLIENT, successfully authenticated
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
    at com.hazelcast.impl.ClientHandlerService$ClientAuthenticateHandler.processCall(ClientHandlerService.java:852)
    at com.hazelcast.impl.ClientHandlerService$ClientOperationHandler.handle(ClientHandlerService.java:1565)
    at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:57)
    at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:54)
    at com.hazelcast.impl.ClientRequestHandler.doRun(ClientRequestHandler.java:63)
    at com.hazelcast.impl.FallThroughRunnable.run(FallThroughRunnable.java:22)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
    at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
ERROR IN SolrLogFormatter!
original message:HazelcastClient is CLIENT_CONNECTION_OPENING
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
    at com.hazelcast.client.LifecycleServiceClientImpl.fireLifecycleEvent(LifecycleServiceClientImpl.java:83)
    at com.hazelcast.client.LifecycleServiceClientImpl$1.call(LifecycleServiceClientImpl.java:72)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:HazelcastClient is CLIENT_CONNECTION_OPENED
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
    at com.hazelcast.client.LifecycleServiceClientImpl.fireLifecycleEvent(LifecycleServiceClientImpl.java:83)
    at com.hazelcast.client.LifecycleServiceClientImpl$1.call(LifecycleServiceClientImpl.java:72)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:HazelcastClient is STARTED
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
    at com.hazelcast.client.LifecycleServiceClientImpl.fireLifecycleEvent(LifecycleServiceClientImpl.java:83)
    at com.hazelcast.client.LifecycleServiceClientImpl$1.call(LifecycleServiceClientImpl.java:72)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter!
original message:[127.0.0.1]:5720 [DataONEBuildTest] Initializing cluster partition table first arrangement...
Exception: java.lang.NullPointerException
    at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
    at java.util.HashMap.hash(HashMap.java:339)
    at java.util.HashMap.get(HashMap.java:557)
    at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
    at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
    at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
    at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
    at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
    at java.util.logging.Logger.log(Logger.java:738)
    at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
    at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
    at com.hazelcast.impl.PartitionManager.firstArrangement(PartitionManager.java:160)
    at com.hazelcast.impl.PartitionManager.getOwner(PartitionManager.java:145)
    at com.hazelcast.impl.PartitionServiceImpl$3.process(PartitionServiceImpl.java:143)
    at com.hazelcast.cluster.ClusterService.processProcessable(ClusterService.java:190)
    at com.hazelcast.cluster.ClusterService.dequeueProcessables(ClusterService.java:256)
    at com.hazelcast.cluster.ClusterService.run(ClusterService.java:201)
    at java.lang.Thread.run(Thread.java:748)
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing\-DataProcessor.1.m
[ WARN] 2019-10-15 22:55:11,758 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
    at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
    at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.addSystemMetadata(ProvRdfXmlProcessorTest.java:570)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.insertResource(ProvRdfXmlProcessorTest.java:537)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.testInsertProvResourceMap(ProvRdfXmlProcessorTest.java:322)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing\-Configure.1.m
[ WARN] 2019-10-15 22:55:12,158 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing-Configure.1.m to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
    at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
    at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.addSystemMetadata(ProvRdfXmlProcessorTest.java:570)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.insertResource(ProvRdfXmlProcessorTest.java:537)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.testInsertProvResourceMap(ProvRdfXmlProcessorTest.java:327)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
    at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
    at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
    at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
    at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
    at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
    at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
    at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
    at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
    at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing\-schedule_AW02XX_001CTDXXXXR00_processing.1.m
[ WARN] 2019-10-15 22:55:12,178 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m to its system metadata. However, this event still should be granted for indexing for safe.
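The `the filter is id:…` lines above show the generator querying Solr with backslash-escaped identifiers (hyphens escaped, dots left alone). A minimal sketch of that escaping, assuming the standard Lucene/Solr query-parser special-character rules; `SolrQueryEscaper` and its method names are illustrative, not taken from the d1_cn_index_processor source (SolrJ's real `ClientUtils.escapeQueryChars` does the equivalent):

```java
// Hypothetical helper mirroring Lucene/Solr query-syntax escaping.
public class SolrQueryEscaper {

    /** Backslash-escape characters that are special in the Lucene query syntax. */
    public static String escape(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            // Special characters per the query-parser syntax, plus '-',
            // which the log lines above show escaped inside identifiers.
            if ("\\+-!():^[]\"{}~*?|&;/ ".indexOf(c) >= 0) {
                sb.append('\\');
            }
            sb.append(c);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Reproduces the shape of the "the filter is id:..." log lines.
        System.out.println("id:" + escape("ala-wai-ns02-ctd-data.1.txt"));
        // -> id:ala\-wai\-ns02\-ctd\-data.1.txt
    }
}
```

Note that `.` is not in the escape set, which matches the log output where `.1.m` and `.1.txt` appear unescaped.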
java.lang.NullPointerException
    at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
    at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.addSystemMetadata(ProvRdfXmlProcessorTest.java:570)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.insertResource(ProvRdfXmlProcessorTest.java:537)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.testInsertProvResourceMap(ProvRdfXmlProcessorTest.java:331)
    ... (remaining reflection and test-rule frames identical to the first NullPointerException trace above)
    at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
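The same NullPointerException at HZEventFilter.filter(HZEventFilter.java:115) recurs once per inserted object: the filter compares an object's Solr record against its system metadata, and when no Solr record exists yet the comparison dereferences null; the warning shows the catch block then lets the event through anyway. A minimal sketch of that failure mode with a null-safe guard; `SolrRecordStub`, `shouldIndex`, and the field names are hypothetical stand-ins, since the actual HZEventFilter source is not shown in this log:

```java
import java.util.Date;

// Sketch of the comparison inside a filter like HZEventFilter.filter;
// all type and method names here are illustrative.
public class EventFilterSketch {

    public static class SolrRecordStub {
        public Date dateModified;
        public SolrRecordStub(Date d) { dateModified = d; }
    }

    /** Returns true when the event should be queued for indexing. */
    public static boolean shouldIndex(SolrRecordStub record, Date sysMetaModified) {
        if (record == null || record.dateModified == null) {
            // No Solr record yet (first-time insert, as in the test run above):
            // queue the event instead of dereferencing null, which is what
            // produces the repeated NPE in this log.
            return true;
        }
        // Re-index only if the Solr copy is older than the system metadata.
        return record.dateModified.before(sysMetaModified);
    }

    public static void main(String[] args) {
        System.out.println(shouldIndex(null, new Date()));
    }
}
```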
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing.eml.1.xml
[ WARN] 2019-10-15 22:55:12,196 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing.eml.1.xml to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
    at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
    at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.addSystemMetadata(ProvRdfXmlProcessorTest.java:570)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.insertResource(ProvRdfXmlProcessorTest.java:537)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.testInsertProvResourceMap(ProvRdfXmlProcessorTest.java:338)
    ... (remaining reflection and test-rule frames identical to the first NullPointerException trace above)
    at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-ns02\-image\-data\-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg
[ WARN] 2019-10-15 22:55:12,219 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-ns02-image-data-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
    at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
    at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.addSystemMetadata(ProvRdfXmlProcessorTest.java:570)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.insertResource(ProvRdfXmlProcessorTest.java:537)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.testInsertProvResourceMap(ProvRdfXmlProcessorTest.java:343)
    ... (remaining reflection and test-rule frames identical to the first NullPointerException trace above)
    at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-ns02\-ctd\-data.1.txt
[ WARN] 2019-10-15 22:55:12,244 [TEST-ProvRdfXmlProcessorTest.testInsertProvResourceMap-seed#[33F54DF03727D939]] (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-ns02-ctd-data.1.txt to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
    at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
    at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.addSystemMetadata(ProvRdfXmlProcessorTest.java:570)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.insertResource(ProvRdfXmlProcessorTest.java:537)
    at org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest.testInsertProvResourceMap(ProvRdfXmlProcessorTest.java:349)
    ... (remaining reflection and test-rule frames identical to the first NullPointerException trace above)
    at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.status=? and indextask0_.tryCount
:203) FileNotFound: No certificate installed in the default location: /tmp/x509up_u107
[ WARN] 2019-10-15 22:55:17,449 [pool-1-thread-2] (org.dataone.client.utils.HttpConnectionMonitorService$SingletonHolder::46) Starting monitor thread
[ WARN] 2019-10-15 22:55:17,449 [Thread-16] (org.dataone.client.utils.HttpConnectionMonitorService:run:96) Starting monitoring...
[ WARN] 2019-10-15 22:55:17,449 [pool-1-thread-2] (org.dataone.client.utils.HttpConnectionMonitorService:addMonitor:65) registering ConnectionManager...
[ WARN] 2019-10-15 22:55:17,640 [pool-1-thread-2] (org.dataone.client.v2.itk.D1Client:bestAttemptRefreshNodeLocator:327) Could not refresh D1Client's NodeLocator, using previous one.
org.dataone.service.exceptions.ServiceFailure: 404: 404: parser for deserializing HTML not written yet.
Providing stripped-down html message body starting next line: HTTP Status 404 – Not FoundType Status ReportMessage /cn/v2/nodeDescription The origin server did not find a current representation for the target resource or is not willing to disclose that one exists.Apache Tomcat/8.5.39 (Ubuntu) at org.dataone.service.util.ExceptionHandler.deserializeHtmlAndThrowException(ExceptionHandler.java:465) at org.dataone.service.util.ExceptionHandler.deserializeAndThrowException(ExceptionHandler.java:403) at org.dataone.service.util.ExceptionHandler.deserializeAndThrowException(ExceptionHandler.java:344) at org.dataone.service.util.ExceptionHandler.filterErrors(ExceptionHandler.java:138) at org.dataone.client.rest.HttpMultipartRestClient.doGetRequest(HttpMultipartRestClient.java:343) at org.dataone.client.rest.HttpMultipartRestClient.doGetRequest(HttpMultipartRestClient.java:328) at org.dataone.client.v2.impl.MultipartCNode.listNodes(MultipartCNode.java:445) at org.dataone.client.v2.impl.SettingsContextNodeLocator.getNodeListFromSettingsCN(SettingsContextNodeLocator.java:116) at org.dataone.client.v2.impl.SettingsContextNodeLocator.(SettingsContextNodeLocator.java:83) at org.dataone.client.v2.itk.D1Client.bestAttemptRefreshNodeLocator(D1Client.java:322) at org.dataone.client.v2.itk.D1Client.getCN(D1Client.java:268) at org.dataone.client.v2.formats.ObjectFormatCache.refreshCache(ObjectFormatCache.java:195) at org.dataone.client.v2.formats.ObjectFormatCache.(ObjectFormatCache.java:96) at org.dataone.client.v2.formats.ObjectFormatCache.(ObjectFormatCache.java:57) at org.dataone.client.v2.formats.ObjectFormatCache$ObjectFormatCacheSingleton.(ObjectFormatCache.java:110) at org.dataone.client.v2.formats.ObjectFormatCache.getInstance(ObjectFormatCache.java:117) at org.dataone.cn.index.processor.IndexTaskProcessor.isDataObject(IndexTaskProcessor.java:896) at org.dataone.cn.index.processor.IndexTaskProcessor.isObjectPathReady(IndexTaskProcessor.java:866) at 
org.dataone.cn.index.processor.IndexTaskProcessor.doTaskPreChecks(IndexTaskProcessor.java:716) at org.dataone.cn.index.processor.IndexTaskProcessor.processTask(IndexTaskProcessor.java:338) at org.dataone.cn.index.processor.IndexTaskProcessor$1.run(IndexTaskProcessor.java:308) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) [ WARN] 2019-10-15 22:55:17,738 [pool-1-thread-2] (org.apache.http.client.protocol.ResponseProcessCookies:processCookies:121) Cookie rejected [JSESSIONID="4A97760AC73E021CAB6911A00AAF5307", version:0, domain:cn-dev.test.dataone.org, path:/metacat, expiry:null] Illegal path attribute "/metacat". Path of origin: "/cn/v2/formats" [ INFO] 2019-10-15 22:55:17,857 [pool-1-thread-1] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:55:17,857 [pool-1-thread-3] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:55:17,857 [pool-1-thread-4] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:55:17,858 [pool-1-thread-3] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@79ebc215 [ INFO] 2019-10-15 22:55:17,858 [pool-1-thread-2] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:55:17,858 [pool-1-thread-5] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from 
parseDocuments: 1 [ INFO] 2019-10-15 22:55:17,857 [pool-1-thread-1] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@4b4c9eea [ INFO] 2019-10-15 22:55:17,859 [pool-1-thread-5] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@65ae9274 [ INFO] 2019-10-15 22:55:17,859 [pool-1-thread-2] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@34118855 [ INFO] 2019-10-15 22:55:17,860 [pool-1-thread-2] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing-Configure.1.m [ INFO] 2019-10-15 22:55:17,858 [pool-1-thread-3] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m [ INFO] 2019-10-15 22:55:17,858 [pool-1-thread-4] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@c266486 [ INFO] 2019-10-15 22:55:17,862 [pool-1-thread-4] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing.eml.1.xml [ INFO] 2019-10-15 22:55:17,862 [pool-1-thread-2] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false [ INFO] 2019-10-15 22:55:17,862 [pool-1-thread-3] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false [ INFO] 2019-10-15 22:55:17,860 [pool-1-thread-5] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: 
ala-wai-ns02-image-data-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg [ INFO] 2019-10-15 22:55:17,864 [pool-1-thread-5] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false [ INFO] 2019-10-15 22:55:17,860 [pool-1-thread-1] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m [ INFO] 2019-10-15 22:55:17,865 [pool-1-thread-1] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false [ INFO] 2019-10-15 22:55:17,872 [pool-1-thread-4] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:55:17,874 [pool-1-thread-4] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@673052c0 [ INFO] 2019-10-15 22:55:17,876 [pool-1-thread-4] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:55:17,876 [pool-1-thread-4] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false Hibernate: select indextask0_.id as id0_0_, indextask0_.dateSysMetaModified as dateSysM2_0_0_, indextask0_.deleted as deleted0_0_, indextask0_.formatId as formatId0_0_, indextask0_.nextExecution as nextExec5_0_0_, indextask0_.objectPath as objectPath0_0_, indextask0_.pid as pid0_0_, indextask0_.priority as priority0_0_, indextask0_.status as status0_0_, indextask0_.sysMetadata as sysMeta10_0_0_, indextask0_.taskModifiedDate as taskMod11_0_0_, indextask0_.tryCount as tryCount0_0_, indextask0_.version as version0_0_ from index_task indextask0_ where indextask0_.id=? 
Hibernate: select indextask0_.id as id0_0_, indextask0_.dateSysMetaModified as dateSysM2_0_0_, indextask0_.deleted as deleted0_0_, indextask0_.formatId as formatId0_0_, indextask0_.nextExecution as nextExec5_0_0_, indextask0_.objectPath as objectPath0_0_, indextask0_.pid as pid0_0_, indextask0_.priority as priority0_0_, indextask0_.status as status0_0_, indextask0_.sysMetadata as sysMeta10_0_0_, indextask0_.taskModifiedDate as taskMod11_0_0_, indextask0_.tryCount as tryCount0_0_, indextask0_.version as version0_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: select indextask0_.id as id0_0_, indextask0_.dateSysMetaModified as dateSysM2_0_0_, indextask0_.deleted as deleted0_0_, indextask0_.formatId as formatId0_0_, indextask0_.nextExecution as nextExec5_0_0_, indextask0_.objectPath as objectPath0_0_, indextask0_.pid as pid0_0_, indextask0_.priority as priority0_0_, indextask0_.status as status0_0_, indextask0_.sysMetadata as sysMeta10_0_0_, indextask0_.taskModifiedDate as taskMod11_0_0_, indextask0_.tryCount as tryCount0_0_, indextask0_.version as version0_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select indextask0_.id as id0_0_, indextask0_.dateSysMetaModified as dateSysM2_0_0_, indextask0_.deleted as deleted0_0_, indextask0_.formatId as formatId0_0_, indextask0_.nextExecution as nextExec5_0_0_, indextask0_.objectPath as objectPath0_0_, indextask0_.pid as pid0_0_, indextask0_.priority as priority0_0_, indextask0_.status as status0_0_, indextask0_.sysMetadata as sysMeta10_0_0_, indextask0_.taskModifiedDate as taskMod11_0_0_, indextask0_.tryCount as tryCount0_0_, indextask0_.version as version0_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select indextask0_.id as id0_0_, indextask0_.dateSysMetaModified as dateSysM2_0_0_, indextask0_.deleted as deleted0_0_, indextask0_.formatId as formatId0_0_, indextask0_.nextExecution as nextExec5_0_0_, indextask0_.objectPath as objectPath0_0_, indextask0_.pid as pid0_0_, indextask0_.priority as priority0_0_, indextask0_.status as status0_0_, indextask0_.sysMetadata as sysMeta10_0_0_, indextask0_.taskModifiedDate as taskMod11_0_0_, indextask0_.tryCount as tryCount0_0_, indextask0_.version as version0_0_ from index_task indextask0_ where indextask0_.id=?
[ INFO] 2019-10-15 22:55:18,010 [pool-1-thread-2] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
Hibernate: delete from index_task where id=? and version=?
[ INFO] 2019-10-15 22:55:18,011 [pool-1-thread-2] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@11be3b01
[ INFO] 2019-10-15 22:55:18,013 [pool-1-thread-2] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: ala-wai-ns02-ctd-data.1.txt
[ INFO] 2019-10-15 22:55:18,015 [pool-1-thread-2] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Hibernate: select indextask0_.id as id0_0_, indextask0_.dateSysMetaModified as dateSysM2_0_0_, indextask0_.deleted as deleted0_0_, indextask0_.formatId as formatId0_0_, indextask0_.nextExecution as nextExec5_0_0_, indextask0_.objectPath as objectPath0_0_, indextask0_.pid as pid0_0_, indextask0_.priority as priority0_0_, indextask0_.status as status0_0_, indextask0_.sysMetadata as sysMeta10_0_0_, indextask0_.taskModifiedDate as taskMod11_0_0_, indextask0_.tryCount as tryCount0_0_, indextask0_.version as version0_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select indextask0_.id as id0_, indextask0_.dateSysMetaModified as dateSysM2_0_, indextask0_.deleted as deleted0_, indextask0_.formatId as formatId0_, indextask0_.nextExecution as nextExec5_0_, indextask0_.objectPath as objectPath0_, indextask0_.pid as pid0_, indextask0_.priority as priority0_, indextask0_.status as status0_, indextask0_.sysMetadata as sysMeta10_0_, indextask0_.taskModifiedDate as taskMod11_0_, indextask0_.tryCount as tryCount0_, indextask0_.version as version0_ from index_task indextask0_ where indextask0_.status=? and indextask0_.tryCount:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:55:37,890 [TEST-ProvRdfXmlProcessorTest.testInit-seed#[33F54DF03727D939]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ INFO] 2019-10-15 22:55:38,162 [TEST-ProvRdfXmlProcessorTest.testProvenanceFields-seed#[33F54DF03727D939]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:55:38,275 [TEST-ProvRdfXmlProcessorTest.testProvenanceFields-seed#[33F54DF03727D939]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:55:38,287 [TEST-ProvRdfXmlProcessorTest.testProvenanceFields-seed#[33F54DF03727D939]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
referencedPid: ala-wai-canal-ns02-ctd-data.1.txt
referencedPid: ala-wai-canal-ns02-image-data-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg
referencedPid: ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m
referencedPid: ala-wai-canal-ns02-ctd-data.1.txt
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.902 s - in org.dataone.cn.indexer.resourcemap.ProvRdfXmlProcessorTest
[INFO] Running org.dataone.cn.indexer.resourcemap.OREResourceMapTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.097 s - in org.dataone.cn.indexer.resourcemap.OREResourceMapTest
[INFO] Running org.dataone.cn.indexer.convert.TemporalPeriodParsingUtilityTest
[ERROR] 2019-10-15 22:55:38,790 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198)
[ERROR] 2019-10-15 22:55:38,791 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: 2005
[ERROR] 2019-10-15 22:55:38,793 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198)
[ERROR] 2019-10-15 22:55:38,793 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[ERROR] 2019-10-15 22:55:38,793 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198)
[ERROR] 2019-10-15 22:55:38,794 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: 2000
[ERROR] 2019-10-15 22:55:38,795 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198)
[ERROR] 2019-10-15 22:55:38,795 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.dataone.cn.indexer.convert.TemporalPeriodParsingUtilityTest
[INFO] Running org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeDocumentServiceTest
[ INFO] 2019-10-15 22:55:39,062 [main] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:55:39,169 [main] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.462 s - in org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeDocumentServiceTest
[INFO] Running org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeConverterTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeConverterTest
[INFO] Running org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypesParserTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypesParserTest
[INFO] Running org.dataone.cn.indexer.solrhttp.SolrSchemaBeanConfigTest
[ INFO] 2019-10-15 22:55:39,286 [main] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
internal * id * _version_
ore * resourceMap * documents * isDocumentedBy
mn_service * isService * serviceCoupling * serviceTitle * serviceDescription * serviceType * serviceEndpoint * serviceInput * serviceOutput
scimeta * author * authorSurName * authorGivenName * authorSurNameSort * authorGivenNameSort * authorLastName * abstract * keywords * keyConcept * southBoundCoord * northBoundCoord * westBoundCoord * eastBoundCoord * namedLocation * beginDate * endDate * title * scientificName * relatedOrganizations * datePublished * pubDate * investigator * investigatorText * ogcUrl * sku * LTERSite * origin * originText * titlestr * geoform * presentationCat * purpose * updateDate * edition * originator * originatorText * family * species * genus * kingdom * phylum * order * class * attributeName * attributeLabel * attributeDescription * attributeUnit * attribute * webUrl * contactOrganization * contactOrganizationText * keywordsText * placeKey * noBoundingBox * isSpatial * decade * gcmdKeyword * project * projectText * site * siteText * parameter * parameterText * sensor * sensorText * source * sourceText * term * termText * topic * topicText * fileID * text * geohash_1 * geohash_2 * geohash_3 * geohash_4 * geohash_5 * geohash_6 * geohash_7 * geohash_8 * geohash_9 * funding * funderName * funderIdentifier * awardNumber * awardTitle
sem * sem_annotation * sem_annotated_by * sem_annotates * sem_comment
sysmeta * identifier * seriesId * fileName * mediaType * mediaTypeProperty * formatId * formatType * size * checksum * checksumAlgorithm * dateUploaded * dateModified * submitter * rightsHolder * authoritativeMN * replicationAllowed * numberReplicas * preferredReplicationMN * blockedReplicationMN * replicaMN * replicaVerifiedDate * replicationStatus * datasource * obsoletes * obsoletedBy * readPermission * writePermission * changePermission * isPublic * dataUrl
prov * prov_wasDerivedFrom * prov_wasInformedBy * prov_used * prov_generated * prov_generatedByProgram * prov_generatedByExecution * prov_generatedByUser * prov_usedByProgram * prov_usedByExecution * prov_usedByUser * prov_wasExecutedByExecution * prov_wasExecutedByUser * prov_hasSources * prov_hasDerivations * prov_instanceOfClass
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.026 s - in org.dataone.cn.indexer.solrhttp.SolrSchemaBeanConfigTest
[INFO] Running org.dataone.cn.indexer.solrhttp.SolrElementFieldTest
bath gom1k.nc : the GoM bathymetry derived from the GMRT and NGDC databases (recommended); NetCDF-3 classic format (total size 760 MB).
topo gom1k.nc : the GoM topography (land and water) derived from the GMRT and NGDC databases (recommended); NetCDF-3 classic format (total size 127 MB).
bath gom1k GEBCO.nc : the GoM bathymetry derived from the GEBCO and NGDC databases; NetCDF-3 classic format (total size 760 MB).
topo gom1k GEBCO.nc : the GoM topography (land and water) derived from the GEBCO and NGDC databases; NetCDF-3 classic format (total size 127 MB).
bath guide.pdf -> The manual of the developed GoM 0.01° bathymetry
Various renders of topography and bathymetry|||||
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.017 s - in org.dataone.cn.indexer.solrhttp.SolrElementFieldTest
[INFO] Running org.dataone.cn.indexer.processor.ResourceMapSubprocessorTest
hello?
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.dataone.cn.indexer.processor.ResourceMapSubprocessorTest
[INFO] Running org.dataone.cn.indexer.processor.QueuePrioritizerTest
0. 5.00 5: added: A 1. 5.00 5: added: A 2. 5.00 5: added: A 3. 5.00 5: added: A 4. 5.00 5: added: A 5. 5.00 5: added: A 6. 5.00 5: added: A 7. 5.00 5: added: A 8. 5.00 5: added: A 9. 5.00 5: added: A 10. 5.00 5: added: A 11. 5.00 5: added: A 12. 5.00 5: added: A 13. 5.00 5: added: A 14. 5.00 5: added: A 15. 5.00 5: added: A 16. 5.00 5: added: A 17. 5.00 5: added: A 18. 5.00 5: added: A 19. 5.00 5: added: A 20. 5.00 5: added: A 21. 5.00 5: added: A 22. 5.00 5: added: A 23. 5.00 5: added: A 24. 5.00 5: added: A 25. 5.00 5: added: A 26. 5.00 5: added: A 27. 5.00 5: added: A 28. 5.00 5: added: A 29. 5.00 5: added: A 30. 5.00 5: added: BB 31. 5.00 5: added: BB 32. 5.00 5: added: BB 33. 5.00 5: added: BB 34. 5.00 5: added: BB 35. 5.00 5: added: BB 36. 5.00 5: added: BB 37. 5.00 5: added: BB 38. 5.00 5: added: BB 39. 5.00 5: added: BB 40. 5.00 5: added: BB 41. 5.00 5: added: BB 42. 5.00 5: added: BB 43.
5.00 5: added: BB 44. 5.00 5: added: BB 45. 5.00 5: added: BB 46. 5.00 5: added: BB 47. 5.00 5: added: BB 48. 5.00 5: added: BB 49. 4.00 4: added: BB 50. 3.94 3: added: BB 51. 3.88 3: added: BB 52. 3.83 3: added: BB 53. 3.78 3: added: BB 54. 3.73 3: added: BB 55. 3.68 3: added: BB 56. 3.63 3: added: BB 57. 3.59 3: added: BB 58. 3.54 3: added: BB 59. 3.50 3: added: BB 60. 5.92 5: added: CCC 61. 5.84 5: added: CCC 62. 5.76 5: added: CCC 63. 5.69 5: added: CCC 64. 5.62 5: added: CCC 65. 5.55 5: added: CCC 66. 5.48 5: added: CCC 67. 5.41 5: added: CCC 68. 5.35 5: added: CCC 69. 5.29 5: added: CCC 70. 5.23 5: added: CCC 71. 5.17 5: added: CCC 72. 5.11 5: added: CCC 73. 5.05 5: added: CCC 74. 5.00 5: added: CCC 75. 4.95 4: added: CCC 76. 4.90 4: added: CCC 77. 4.85 4: added: CCC 78. 4.80 4: added: CCC 79. 4.75 4: added: CCC 80. 4.70 4: added: CCC 81. 4.66 4: added: CCC 82. 4.61 4: added: CCC 83. 4.57 4: added: CCC 84. 4.53 4: added: CCC 85. 4.49 4: added: CCC 86. 4.45 4: added: CCC 87. 4.41 4: added: CCC 88. 4.37 4: added: CCC 89. 4.33 4: added: CCC 0. 3.00 3: added: A 1. 3.00 3: added: BB 2. 3.00 3: added: A 3. 3.00 3: added: BB 4. 3.00 3: added: A 5. 3.00 3: added: BB 6. 3.00 3: added: A 7. 3.00 3: added: BB 8. 3.00 3: added: A 9. 3.00 3: added: BB 10. 3.00 3: added: A 11. 3.00 3: added: BB 12. 3.00 3: added: A 13. 3.00 3: added: BB 14. 3.00 3: added: A 15. 3.00 3: added: BB 16. 3.00 3: added: A 17. 3.00 3: added: BB 18. 3.00 3: added: A 19. 3.00 3: added: BB 20. 3.00 3: added: A 21. 3.00 3: added: BB 22. 3.00 3: added: A 23. 3.00 3: added: BB 24. 2.44 2: added: A 25. 2.50 2: added: BB 26. 2.44 2: added: A 27. 2.50 2: added: BB 28. 2.45 2: added: A 29. 2.50 2: added: BB 30. 2.45 2: added: A 31. 2.50 2: added: BB 32. 2.45 2: added: A 33. 2.50 2: added: BB 34. 2.46 2: added: A 35. 2.50 2: added: BB 36. 2.46 2: added: A 37. 2.50 2: added: BB 38. 2.46 2: added: A 39. 2.50 2: added: BB 40. 2.46 2: added: A 41. 2.50 2: added: BB 42. 2.47 2: added: A 43. 2.50 2: added: BB 44. 
2.47 2: added: A 45. 2.50 2: added: BB 46. 2.47 2: added: A 47. 2.50 2: added: BB 48. 2.47 2: added: A 49. 2.50 2: added: BB 50. 2.50 2: added: A 51. 2.50 2: added: BB 52. 2.50 2: added: A 53. 2.50 2: added: BB 54. 2.50 2: added: A 55. 2.50 2: added: BB 56. 2.50 2: added: A 57. 2.50 2: added: BB 58. 2.50 2: added: A 59. 2.50 2: added: BB 60. 2.50 2: added: A 61. 2.50 2: added: BB 62. 2.50 2: added: A 63. 2.50 2: added: BB 64. 2.50 2: added: A 65. 2.50 2: added: BB 66. 2.50 2: added: A 67. 2.50 2: added: BB 68. 2.50 2: added: A 69. 2.50 2: added: BB 70. 2.50 2: added: A 71. 2.50 2: added: BB 72. 2.50 2: added: A 73. 2.50 2: added: BB 74. 2.50 2: added: A 75. 2.50 2: added: BB 76. 2.50 2: added: A 77. 2.50 2: added: BB 78. 2.50 2: added: A 79. 2.50 2: added: BB 80. 2.50 2: added: A 81. 2.50 2: added: BB 82. 2.50 2: added: A 83. 2.50 2: added: BB 84. 2.50 2: added: A 85. 2.50 2: added: BB 86. 2.50 2: added: A 87. 2.50 2: added: BB 88. 2.50 2: added: A 89. 2.50 2: added: BB 90. 2.50 2: added: A 91. 2.50 2: added: BB 92. 2.50 2: added: A 93. 2.50 2: added: BB 94. 2.50 2: added: A 95. 2.50 2: added: BB 96. 2.50 2: added: A 97. 2.50 2: added: BB 98. 2.50 2: added: A 99. 2.50 2: added: BB 100. 2.50 2: added: A 101. 2.50 2: added: BB 102. 2.50 2: added: A 103. 2.50 2: added: BB 104. 2.50 2: added: A 105. 2.50 2: added: BB 106. 2.50 2: added: A 107. 2.50 2: added: BB 108. 2.50 2: added: A 109. 2.50 2: added: BB 110. 2.50 2: added: A 111. 2.50 2: added: BB 112. 2.50 2: added: A 113. 2.50 2: added: BB 114. 2.50 2: added: A 115. 2.50 2: added: BB 116. 2.50 2: added: A 117. 2.50 2: added: BB 118. 2.50 2: added: A 119. 2.50 2: added: BB 120. 2.50 2: added: A 121. 2.50 2: added: BB 122. 2.50 2: added: A 123. 2.50 2: added: BB 124. 2.50 2: added: A 125. 2.50 2: added: BB 126. 2.50 2: added: A 127. 2.50 2: added: BB 128. 2.50 2: added: A 129. 2.50 2: added: BB 130. 2.50 2: added: A 131. 2.50 2: added: BB 132. 2.50 2: added: A 133. 2.50 2: added: BB 134. 2.50 2: added: A 135. 
2.50 2: added: BB 136. 2.50 2: added: A 137. 2.50 2: added: BB 138. 2.50 2: added: A 139. 2.50 2: added: BB 140. 2.50 2: added: A 141. 2.50 2: added: BB 142. 2.50 2: added: A 143. 2.50 2: added: BB 144. 2.50 2: added: A 145. 2.50 2: added: BB 146. 2.50 2: added: A 147. 2.50 2: added: BB 148. 2.50 2: added: A 149. 2.50 2: added: BB 150. 2.50 2: added: A 151. 2.50 2: added: BB 152. 2.50 2: added: A 153. 2.50 2: added: BB 154. 2.50 2: added: A 155. 2.50 2: added: BB 156. 2.50 2: added: A 157. 2.50 2: added: BB 158. 2.50 2: added: A 159. 2.50 2: added: BB 160. 2.50 2: added: A 161. 2.50 2: added: BB 162. 2.50 2: added: A 163. 2.50 2: added: BB 164. 2.50 2: added: A 165. 2.50 2: added: BB 166. 2.50 2: added: A 167. 2.50 2: added: BB 168. 2.50 2: added: A 169. 2.50 2: added: BB 170. 2.50 2: added: A 171. 2.50 2: added: BB 172. 2.50 2: added: A 173. 2.50 2: added: BB 174. 2.50 2: added: A 175. 2.50 2: added: BB 176. 2.50 2: added: A 177. 2.50 2: added: BB 178. 2.50 2: added: A 179. 2.50 2: added: BB 180. 2.50 2: added: A 181. 2.50 2: added: BB 182. 2.50 2: added: A 183. 2.50 2: added: BB 184. 2.50 2: added: A 185. 2.50 2: added: BB 186. 2.50 2: added: A 187. 2.50 2: added: BB 188. 2.50 2: added: A 189. 2.50 2: added: BB 190. 2.50 2: added: A 191. 2.50 2: added: BB 192. 2.50 2: added: A 193. 2.50 2: added: BB 194. 2.50 2: added: A 195. 2.50 2: added: BB 196. 2.50 2: added: A 197. 2.50 2: added: BB 198. 2.50 2: added: A 199. 2.50 2: added: BB 200. 3.94 3: added: CCC 201. 2.50 2: added: BB 202. 3.88 3: added: CCC 203. 2.50 2: added: BB 204. 3.82 3: added: CCC 205. 2.50 2: added: BB 206. 3.76 3: added: CCC 207. 2.50 2: added: BB 208. 3.70 3: added: CCC 209. 2.50 2: added: BB 210. 3.64 3: added: CCC 211. 2.50 2: added: BB 212. 3.58 3: added: CCC 213. 2.50 2: added: BB 214. 3.52 3: added: CCC 215. 2.50 2: added: BB 216. 3.46 3: added: CCC 217. 2.50 2: added: BB 218. 3.40 3: added: CCC 219. 2.50 2: added: BB 220. 3.34 3: added: CCC 221. 2.50 2: added: BB 222. 
3.28 3: added: CCC 223. 2.50 2: added: BB 224. 3.22 3: added: CCC 225. 2.50 2: added: BB 226. 3.16 3: added: CCC 227. 2.50 2: added: BB 228. 3.10 3: added: CCC 229. 2.50 2: added: BB 230. 3.04 3: added: CCC 231. 2.50 2: added: BB 232. 2.98 2: added: CCC 233. 2.50 2: added: BB 234. 2.92 2: added: CCC 235. 2.50 2: added: BB 236. 2.86 2: added: CCC 237. 2.50 2: added: BB 238. 2.80 2: added: CCC 239. 2.50 2: added: BB 240. 2.74 2: added: CCC 241. 2.50 2: added: BB 242. 2.68 2: added: CCC 243. 2.50 2: added: BB 244. 2.62 2: added: CCC 245. 2.50 2: added: BB 246. 2.56 2: added: CCC 247. 2.50 2: added: BB 248. 2.50 2: added: CCC 249. 2.50 2: added: BB 0. 2.00 2: added: A 1. 2.00 2: added: A 2. 2.00 2: added: A 3. 2.00 2: added: A 4. 2.00 2: added: A 5. 2.00 2: added: A 6. 2.00 2: added: A 7. 2.00 2: added: A 8. 2.00 2: added: A 9. 2.00 2: added: A 10. 2.00 2: added: A 11. 2.00 2: added: A 12. 2.00 2: added: A 13. 2.00 2: added: A 14. 2.00 2: added: A 15. 2.00 2: added: A 16. 2.00 2: added: A 17. 2.00 2: added: A 18. 2.00 2: added: A 19. 2.00 2: added: A 20. 2.00 2: added: A 21. 2.00 2: added: A 22. 2.00 2: added: A 23. 2.00 2: added: A 24. 2.00 2: added: A 25. 2.00 2: added: A 26. 2.00 2: added: A 27. 2.00 2: added: A 28. 2.00 2: added: A 29. 2.00 2: added: A 30. 2.00 2: added: A 31. 2.00 2: added: A 32. 2.00 2: added: A 33. 2.00 2: added: A 34. 2.00 2: added: A 35. 2.00 2: added: A 36. 2.00 2: added: A 37. 2.00 2: added: A 38. 2.00 2: added: A 39. 2.00 2: added: A 40. 2.00 2: added: A 41. 2.00 2: added: A 42. 2.00 2: added: A 43. 2.00 2: added: A 44. 2.00 2: added: A 45. 2.00 2: added: A 46. 2.00 2: added: A 47. 2.00 2: added: A 48. 2.00 2: added: A 49. 1.00 1: added: A 50. 1.00 1: added: A 51. 1.00 1: added: A 52. 1.00 1: added: A 53. 1.00 1: added: A 54. 1.00 1: added: A 55. 1.00 1: added: A 56. 1.00 1: added: A 57. 1.00 1: added: A 58. 1.00 1: added: A 59. 1.00 1: added: A 60. 1.00 1: added: A 61. 1.00 1: added: A 62. 1.00 1: added: A 63. 1.00 1: added: A 64. 
1.00 1: added: A 65. 1.00 1: added: A 66. 1.00 1: added: A 67. 1.00 1: added: A 68. 1.00 1: added: A 69. 1.00 1: added: A 70. 1.00 1: added: A 71. 1.00 1: added: A 72. 1.00 1: added: A 73. 1.00 1: added: A 74. 1.00 1: added: A 75. 1.00 1: added: A 76. 1.00 1: added: A 77. 1.00 1: added: A 78. 1.00 1: added: A 79. 1.00 1: added: A 80. 1.00 1: added: A 81. 1.00 1: added: A 82. 1.00 1: added: A 83. 1.00 1: added: A 84. 1.00 1: added: A 85. 1.00 1: added: A 86. 1.00 1: added: A 87. 1.00 1: added: A 88. 1.00 1: added: A 89. 1.00 1: added: A 90. 1.00 1: added: A 91. 1.00 1: added: A 92. 1.00 1: added: A 93. 1.00 1: added: A 94. 1.00 1: added: A 95. 1.00 1: added: A 96. 1.00 1: added: A 97. 1.00 1: added: A 98. 1.00 1: added: A 99. 1.00 1: added: A 100. 1.00 1: added: A 101. 1.00 1: added: A 102. 1.00 1: added: A 103. 1.00 1: added: A 104. 1.00 1: added: A 105. 1.00 1: added: A 106. 1.00 1: added: A 107. 1.00 1: added: A 108. 1.00 1: added: A 109. 1.00 1: added: A 110. 1.00 1: added: A 111. 1.00 1: added: A 112. 1.00 1: added: A 113. 1.00 1: added: A 114. 1.00 1: added: A 115. 1.00 1: added: A 116. 1.00 1: added: A 117. 1.00 1: added: A 118. 1.00 1: added: A 119. 1.00 1: added: A 120. 1.00 1: added: A 121. 1.00 1: added: A 122. 1.00 1: added: A 123. 1.00 1: added: A 124. 1.00 1: added: A 125. 1.00 1: added: A 126. 1.00 1: added: A 127. 1.00 1: added: A 128. 1.00 1: added: A 129. 1.00 1: added: A 130. 1.00 1: added: A 131. 1.00 1: added: A 132. 1.00 1: added: A 133. 1.00 1: added: A 134. 1.00 1: added: A 135. 1.00 1: added: A 136. 1.00 1: added: A 137. 1.00 1: added: A 138. 1.00 1: added: A 139. 1.00 1: added: A 140. 1.00 1: added: A 141. 1.00 1: added: A 142. 1.00 1: added: A 143. 1.00 1: added: A 144. 1.00 1: added: A 145. 1.00 1: added: A 146. 1.00 1: added: A 147. 1.00 1: added: A 148. 1.00 1: added: A 149. 1.00 1: added: A 150. 1.00 1: added: A 151. 1.00 1: added: A 152. 1.00 1: added: A 153. 1.00 1: added: A 154. 1.00 1: added: A 155. 1.00 1: added: A 156. 
1.00 1: added: A 157. 1.00 1: added: A 158. 1.00 1: added: A 159. 1.00 1: added: A 160. 1.00 1: added: A 161. 1.00 1: added: A 162. 1.00 1: added: A 163. 1.00 1: added: A 164. 1.00 1: added: A 165. 1.00 1: added: A 166. 1.00 1: added: A 167. 1.00 1: added: A 168. 1.00 1: added: A 169. 1.00 1: added: A 170. 1.00 1: added: A 171. 1.00 1: added: A 172. 1.00 1: added: A 173. 1.00 1: added: A 174. 1.00 1: added: A 175. 1.00 1: added: A 176. 1.00 1: added: A 177. 1.00 1: added: A 178. 1.00 1: added: A 179. 1.00 1: added: A 180. 1.00 1: added: A 181. 1.00 1: added: A 182. 1.00 1: added: A 183. 1.00 1: added: A 184. 1.00 1: added: A 185. 1.00 1: added: A 186. 1.00 1: added: A 187. 1.00 1: added: A 188. 1.00 1: added: A 189. 1.00 1: added: A 190. 1.00 1: added: A 191. 1.00 1: added: A 192. 1.00 1: added: A 193. 1.00 1: added: A 194. 1.00 1: added: A 195. 1.00 1: added: A 196. 1.00 1: added: A 197. 1.00 1: added: A 198. 1.00 1: added: A 199. 1.00 1: added: A 200. 2.98 2: added: B 201. 1.02 1: added: A 202. 2.96 2: added: B 203. 1.04 1: added: A 204. 2.94 2: added: B 205. 1.06 1: added: A 206. 2.92 2: added: B 207. 1.08 1: added: A 208. 2.90 2: added: B 209. 1.10 1: added: A 210. 2.88 2: added: B 211. 1.12 1: added: A 212. 2.86 2: added: B 213. 1.14 1: added: A 214. 2.84 2: added: B 215. 1.16 1: added: A 216. 2.82 2: added: B 217. 1.18 1: added: A 218. 2.80 2: added: B 219. 1.20 1: added: A [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.055 s - in org.dataone.cn.indexer.processor.QueuePrioritizerTest [INFO] Running org.dataone.cn.indexer.processor.TestIndexTaskProcessorConcurrency ============= scheduler START ==============Tue Oct 15 22:55:39 UTC 2019 instantiating MockIndexTaskProcessorJob: MockIndexTaskProcessorJob: entering execute... 
Submitted task 0
Submitted task 1
Starting task: 0
Starting task: 1
Submitted task 2
Submitted task 3
[... "Submitted task <n>" entries continue through task 99 ...]
MockIndexTaskProcessorJob...finished execution in (millis) 7
Starting task: 2
after 3000 millis, finishing task: 0
after 3000 millis, finishing task: 1
Starting task: 3
Starting task: 4
after 3000 millis, finishing task: 2
Starting task: 5
============= attempt to kill the Job ==============Tue Oct 15 22:55:44 UTC 2019
********************* ProcessorJob interrupt called, calling executorservice shutdown...
Job scheduler finish executing all jobs.
============= continue to wait 5 sec... ==============Tue Oct 15 22:55:44 UTC 2019
after 3000 millis, finishing task: 3
after 3000 millis, finishing task: 4
Starting task: 6
Starting task: 7
after 3000 millis, finishing task: 5
Starting task: 8
after 3000 millis, finishing task: 7
after 3000 millis, finishing task: 6
Starting task: 9
Starting task: 10
after 3000 millis, finishing task: 8
Starting task: 11
============= scheduler SHUTDOWN ==============Tue Oct 15 22:55:49 UTC 2019
[... the "Starting task: <n>" / "after 3000 millis, finishing task: <n>" interleave continues in the same pattern for tasks 9 through 38 ...]
Starting task: 39
Starting task: 40
after 3000 millis, finishing task: 38
Starting task: 41
============= DONE !!!!!!!!!! ==============Tue Oct 15 22:56:19 UTC 2019
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.025 s - in org.dataone.cn.indexer.processor.TestIndexTaskProcessorConcurrency
[INFO] Running org.dataone.cn.indexer.processor.ProcessorShutdownTest
Callable 1 is sleeping... 1571180179429
Callable 2 is sleeping... 1571180179449
[... "Callable <n> is sleeping..." entries continue through Callable 10 at 1571180179611 ...]
Shutting down the executor service... 1571180180413
Try to submit more tasks to shutdown executor...
1571180180413
Exception from executor service while trying to submit tasks to a shutdown executor
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@55c53a33 rejected from java.util.concurrent.ThreadPoolExecutor@53b7f657[Shutting down, pool size = 10, active threads = 10, queued tasks = 40, completed tasks = 0]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
    at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
    at org.dataone.cn.indexer.processor.ProcessorShutdownTest.testShutdown(ProcessorShutdownTest.java:41)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:272)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:236)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:386)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:323)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:143)
Canceling cancelable tasks... 1571180180417
Task 1 is NOT done. 1571180180417
Task 1 successfully canceled. 1571180180417
Task 2 is NOT done. 1571180180417
Task 2 successfully canceled. 1571180180417
[... "Task <n> is NOT done." / "Task <n> successfully canceled." pairs continue through Task 14, timestamps 1571180180417 through 1571180180418 ...]
Task 15 is NOT done.
1571180180418
Task 15 successfully canceled. 1571180180418
Task 16 is NOT done. 1571180180418
Task 16 successfully canceled. 1571180180418
[... "Task <n> is NOT done." / "Task <n> successfully canceled." pairs continue through Task 39, timestamps 1571180180418 through 1571180180420 ...]
Task 40 is NOT done.
1571180180420
Task 40 successfully canceled. 1571180180420
Task 41 is NOT done. 1571180180420
Task 41 successfully canceled. 1571180180420
[... "Task <n> is NOT done." / "Task <n> successfully canceled." pairs continue through Task 50, timestamps 1571180180420 through 1571180180421 ...]
Sleeping 4000ms... 1571180180421
after 3000 millis, finishing task: 39
after 3000 millis, finishing task: 40
Starting task: 42
Starting task: 43
after 3000 millis, finishing task: 41
Starting task: 44
Callable 1 is done. 1571180181450
Callable 2 is done. 1571180181462
[... "Callable <n> is done." entries continue through Callable 10 at 1571180181625 ...]
after 3000 millis, finishing task: 43
after 3000 millis, finishing task: 42
Starting task: 45
Starting task: 46
after 3000 millis, finishing task: 44
Starting task: 47
Starting a hard shutdown of executor... 1571180184421
Done.
1571180184421
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.013 s - in org.dataone.cn.indexer.processor.ProcessorShutdownTest
[INFO] Running org.dataone.cn.indexer.annotation.AnnotatorSubprocessorTest
[ INFO] 2019-10-15 22:56:24,718 [main] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:24,811 [main] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.403 s - in org.dataone.cn.indexer.annotation.AnnotatorSubprocessorTest
[INFO] Running org.dataone.cn.indexer.annotation.SolrIndexAnnotatorTest
Creating dataDir: /tmp/org.dataone.cn.indexer.annotation.SolrIndexAnnotatorTest_D998F3CBEC93099C-001/init-core-data-001
[ INFO] 2019-10-15 22:56:25,098 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:25,193 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:25,238 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
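The ProcessorShutdownTest output above exercises the standard `ExecutorService` contract: after `shutdown()`, in-flight tasks keep running, new submissions are rejected with `RejectedExecutionException` (the stack trace logged above), and queued-but-unstarted futures report "NOT done" and can still be canceled. A minimal sketch of both behaviors (pool size and task counts here are illustrative, not the test's actual values):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class ShutdownSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        List<Future<Integer>> queued = new ArrayList<>();
        for (int i = 0; i < 6; i++) {
            final int n = i;
            // Each task sleeps briefly so later tasks remain queued behind it.
            queued.add(pool.submit(() -> { Thread.sleep(200); return n; }));
        }
        pool.shutdown(); // running tasks continue; new submissions are refused
        boolean rejected = false;
        try {
            pool.submit(() -> 99); // throws: same exception the test logs above
        } catch (RejectedExecutionException e) {
            rejected = true;
        }
        // A queued-but-unstarted task is not done yet and cancels successfully.
        Future<Integer> last = queued.get(queued.size() - 1);
        boolean canceled = !last.isDone() && last.cancel(true);
        System.out.println("rejected=" + rejected + " canceled=" + canceled);
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```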
[ WARN] 2019-10-15 22:56:25,490 [coreLoadExecutor-15-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-10-15 22:56:25,492 [coreLoadExecutor-15-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-10-15 22:56:25,513 [coreLoadExecutor-15-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-10-15 22:56:25,544 [coreLoadExecutor-15-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ INFO] 2019-10-15 22:56:25,552 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:25,553 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@56561180
[ INFO] 2019-10-15 22:56:25,553 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:25,585 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:25,585 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5bce621
[ INFO] 2019-10-15 22:56:25,586 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:25,586 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Comparing value for field abstract
Doc Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
Solr Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
Comparing value for field keywords
Doc Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
Solr Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
Comparing value for field title Doc Value: Augrabies falls National Park census data. Solr Value: Augrabies falls National Park census data.
Comparing value for field southBoundCoord Doc Value: 26.0 Solr Value: 26.0
Comparing value for field northBoundCoord Doc Value: 26.0 Solr Value: 26.0
Comparing value for field westBoundCoord Doc Value: -120.31121 Solr Value: -120.31121
Comparing value for field eastBoundCoord Doc Value: -120.31121 Solr Value: -120.31121
Comparing value for field site Doc Value: [Agulhas falls national Park] Solr Value: [Agulhas falls national Park]
Comparing value for field beginDate Doc Value: Thu Jan 01 00:00:00 CST 1998 Solr Value: Thu Jan 01 00:00:00 CST 1998
Comparing value for field endDate Doc Value: Fri Feb 13 00:00:00 CST 2004 Solr Value: Fri Feb 13 00:00:00 CST 2004
Comparing value for field author Doc Value: SANParks Solr Value: SANParks
Comparing value for field authorSurName Doc Value: SANParks Solr Value: SANParks
Comparing value for field authorSurNameSort Doc Value: SANParks Solr Value: SANParks
Comparing value for field authorLastName Doc Value: [SANParks, Garcia, Freeman] Solr Value: [SANParks, Garcia, Freeman]
Comparing value for field investigator Doc Value: [SANParks, Garcia, Freeman] Solr Value: [SANParks, Garcia, Freeman]
Comparing value for field origin Doc Value: [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store] Solr Value: [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
Comparing value for field contactOrganization Doc Value: [SANParks, The Awesome Store] Solr Value: [SANParks, The Awesome Store]
Comparing value for field genus
Doc Value: [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
Solr Value: [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
Comparing value for field species
Doc Value: [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
Solr Value: [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
Comparing value for field scientificName
Doc Value: [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
Solr Value: [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
Comparing value for field attributeName
Doc Value: [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
Solr Value: [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
Comparing value for field attributeDescription
Doc Value: [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
Solr Value: [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
Comparing value for field attributeUnit
Doc Value: [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
Solr Value: [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
Comparing value for field attribute
Doc Value: [ID The ID dimensionless, Lat S Lat S dimensionless, Long E Long E dimensionless, Date The date, Stratum Stratum dimensionless, Transect Transect dimensionless, Species The name of species, LatS LatS dimensionless, LongE LongE dimensionless, Total The total dimensionless, Juvenile Juvenile dimensionless, L/R L/R dimensionless, Species The name of species, Stratum Stratum dimensionless, Date The date, SumOfTotal Sum of the total dimensionless, SumOfJuvenile Sum of juvenile dimensionless, Species The name of species, Date The date, SumOfTotal The sum of total dimensionless, SumOfJuvenile Sum of juvenile dimensionless]
Solr Value: [ID The ID dimensionless, Lat S Lat S dimensionless, Long E Long E dimensionless, Date The date, Stratum Stratum dimensionless, Transect Transect dimensionless, Species The name of species, LatS LatS dimensionless, LongE LongE dimensionless, Total The total dimensionless, Juvenile Juvenile dimensionless, L/R L/R dimensionless, Species The name of species, Stratum Stratum dimensionless, Date The date, SumOfTotal Sum of the total dimensionless, SumOfJuvenile Sum of juvenile dimensionless, Species The name of species, Date The date, SumOfTotal The sum of total dimensionless, SumOfJuvenile Sum of juvenile dimensionless]
Comparing value for field fileID Doc Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4 Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Comparing value for field text
Doc Value: Augrabies falls National Park census data. SANParks Garcia Freddy SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa Freeman Gordon SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa The Awesome Store Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test.
If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location. SANParks, South Africa Augrabies Falls National Park,South Africa Census data EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature Agulhas falls national Park -120.311210 -120.311210 26.0 26.0 1998 2004-02-13 genus Antidorcas species marsupialis Hartmans Zebra Genus Cercopithecus Species aethiops Vervet monkey Genus Diceros Species bicornis Baboon Genus Equus Species hartmannae Giraffe Genus Giraffa Species camelopardalis Kudu Genus Oreotragus Species oreotragus Gemsbok Genus Oryz Species gazella Eland Genus Papio Species hamadryas Genus Taurotragus Species oryx Black rhino Genus Tragelaphus Species strepsiceros Klipspringer 1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
Solr Value: Augrabies falls National Park census data. SANParks Garcia Freddy SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa Freeman Gordon SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa The Awesome Store Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location. SANParks, South Africa Augrabies Falls National Park,South Africa Census data EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature Agulhas falls national Park -120.311210 -120.311210 26.0 26.0 1998 2004-02-13 genus Antidorcas species marsupialis Hartmans Zebra Genus Cercopithecus Species aethiops Vervet monkey Genus Diceros Species bicornis Baboon Genus Equus Species hartmannae Giraffe Genus Giraffa Species camelopardalis Kudu Genus Oreotragus Species oreotragus Gemsbok Genus Oryz Species gazella Eland Genus Papio Species hamadryas Genus Taurotragus Species oryx Black rhino Genus Tragelaphus Species strepsiceros Klipspringer 1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
Comparing value for field geohash_1 Doc Value: [9] Solr Value: [9]
Comparing value for field geohash_2 Doc Value: [9k] Solr Value: [9k]
Comparing value for field geohash_3 Doc Value: [9kd] Solr Value: [9kd]
Comparing value for field geohash_4 Doc Value: [9kd7] Solr Value: [9kd7]
Comparing value for field geohash_5 Doc Value: [9kd7y] Solr Value: [9kd7y]
Comparing value for field geohash_6 Doc Value: [9kd7ym] Solr Value: [9kd7ym]
Comparing value for field geohash_7 Doc Value: [9kd7ym0] Solr Value: [9kd7ym0]
Comparing value for field geohash_8 Doc Value: [9kd7ym0h] Solr Value: [9kd7ym0h]
Comparing value for field geohash_9 Doc Value: [9kd7ym0hc] Solr Value: [9kd7ym0hc]
Comparing value for field isService Doc Value: false Solr Value: false
Comparing value for field id Doc Value: peggym.130.4 Solr Value: peggym.130.4
Comparing value for field identifier Doc Value: peggym.130.4 Solr Value: peggym.130.4
Comparing value for field seriesId Doc Value: peggym.130 Solr Value: peggym.130
Comparing value for field formatId Doc Value:
eml://ecoinformatics.org/eml-2.1.0 Solr Value: eml://ecoinformatics.org/eml-2.1.0
Comparing value for field formatType Doc Value: METADATA Solr Value: METADATA
Comparing value for field size Doc Value: 36281 Solr Value: 36281
Comparing value for field checksum Doc Value: 24426711d5385a9ffa583a13d07af2502884932f Solr Value: 24426711d5385a9ffa583a13d07af2502884932f
Comparing value for field submitter Doc Value: dataone_integration_test_user Solr Value: dataone_integration_test_user
Comparing value for field checksumAlgorithm Doc Value: SHA-1 Solr Value: SHA-1
Comparing value for field rightsHolder Doc Value: dataone_integration_test_user Solr Value: dataone_integration_test_user
Comparing value for field replicationAllowed Doc Value: true Solr Value: true
Comparing value for field obsoletes Doc Value: peggym.130.3 Solr Value: peggym.130.3
Comparing value for field dateUploaded Doc Value: Wed Aug 31 15:59:50 CDT 2011 Solr Value: Wed Aug 31 15:59:50 CDT 2011
Comparing value for field dateModified Doc Value: Wed Aug 31 15:59:50 CDT 2011 Solr Value: Wed Aug 31 15:59:50 CDT 2011
Comparing value for field datasource Doc Value: test_documents Solr Value: test_documents
Comparing value for field authoritativeMN Doc Value: test_documents Solr Value: test_documents
Comparing value for field readPermission Doc Value: [public, dataone_public_user, dataone_test_user] Solr Value: [public, dataone_public_user, dataone_test_user]
Comparing value for field writePermission Doc Value: [dataone_integration_test_user] Solr Value: [dataone_integration_test_user]
Comparing value for field isPublic Doc Value: true Solr Value: true
Comparing value for field dataUrl Doc Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4 Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4
[ INFO] 2019-10-15 22:56:25,641 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:25,642 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@481a2cab
[ INFO] 2019-10-15 22:56:25,642 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: annotation.130.4
[ INFO] 2019-10-15 22:56:27,061 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 2
[ INFO] 2019-10-15 22:56:27,062 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@603d7766
[ INFO] 2019-10-15 22:56:27,062 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: annotation.130.4
[ INFO] 2019-10-15 22:56:27,062 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129) other document to process: peggym.130.4
[ INFO] 2019-10-15 22:56:27,068 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = peggym.130.4
[ INFO] 2019-10-15 22:56:27,069 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136) .... version is: 1647501834287316992
[ INFO] 2019-10-15 22:56:27,069 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[D998F3CBEC93099C]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
FIELD NAME=id, VALUE=peggym.130.4
FIELD NAME=identifier, VALUE=peggym.130.4
FIELD NAME=seriesId, VALUE=peggym.130
FIELD NAME=formatId, VALUE=eml://ecoinformatics.org/eml-2.1.0
FIELD NAME=formatType, VALUE=METADATA
FIELD NAME=size, VALUE=36281
FIELD NAME=checksum, VALUE=24426711d5385a9ffa583a13d07af2502884932f
FIELD NAME=submitter, VALUE=dataone_integration_test_user
FIELD NAME=checksumAlgorithm, VALUE=SHA-1
FIELD NAME=rightsHolder, VALUE=dataone_integration_test_user
FIELD NAME=replicationAllowed, VALUE=true
FIELD NAME=obsoletes, VALUE=peggym.130.3
FIELD NAME=dateUploaded, VALUE=Wed Aug 31 15:59:50 CDT 2011
FIELD NAME=dateModified, VALUE=Wed Aug 31 15:59:50 CDT 2011
FIELD NAME=datasource, VALUE=test_documents
FIELD NAME=authoritativeMN, VALUE=test_documents
FIELD NAME=readPermission, VALUE=[public, dataone_public_user, dataone_test_user]
FIELD NAME=writePermission, VALUE=[dataone_integration_test_user]
FIELD NAME=isPublic, VALUE=true
FIELD NAME=dataUrl, VALUE=https://cn.dataone.org/cn/v2/resolve/peggym.130.4
FIELD NAME=isService, VALUE=false
FIELD NAME=abstract, VALUE=This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
FIELD NAME=keywords, VALUE=[SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
FIELD NAME=title, VALUE=Augrabies falls National Park census data.
FIELD NAME=southBoundCoord, VALUE=26.0
FIELD NAME=northBoundCoord, VALUE=26.0
FIELD NAME=westBoundCoord, VALUE=-120.31121
FIELD NAME=eastBoundCoord, VALUE=-120.31121
FIELD NAME=site, VALUE=[Agulhas falls national Park]
FIELD NAME=beginDate, VALUE=Thu Jan 01 00:00:00 CST 1998
FIELD NAME=endDate, VALUE=Fri Feb 13 00:00:00 CST 2004
FIELD NAME=author, VALUE=SANParks
FIELD NAME=authorSurName, VALUE=SANParks
FIELD NAME=authorSurNameSort, VALUE=SANParks
FIELD NAME=authorLastName, VALUE=[SANParks, Garcia, Freeman]
FIELD NAME=investigator, VALUE=[SANParks, Garcia, Freeman]
FIELD NAME=origin, VALUE=[SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
FIELD NAME=contactOrganization, VALUE=[SANParks, The Awesome Store]
FIELD NAME=genus, VALUE=[Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
FIELD NAME=species, VALUE=[marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
FIELD NAME=scientificName, VALUE=[Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
FIELD NAME=attributeName, VALUE=[ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
FIELD NAME=attributeDescription, VALUE=[The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
FIELD NAME=attributeUnit, VALUE=[dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
FIELD NAME=attribute, VALUE=[ID The ID dimensionless, Lat S Lat S dimensionless, Long E Long E dimensionless, Date The date, Stratum Stratum dimensionless, Transect Transect dimensionless, Species The name of species, LatS LatS dimensionless, LongE LongE dimensionless, Total The total dimensionless, Juvenile Juvenile dimensionless, L/R L/R dimensionless, Species The name of species, Stratum Stratum dimensionless, Date The date, SumOfTotal Sum of the total dimensionless, SumOfJuvenile Sum of juvenile dimensionless, Species The name of species, Date The date, SumOfTotal The sum of total dimensionless, SumOfJuvenile Sum of juvenile dimensionless]
FIELD NAME=fileID, VALUE=https://cn.dataone.org/cn/v2/resolve/peggym.130.4
FIELD NAME=text, VALUE=Augrabies falls National Park census data. SANParks Garcia Freddy SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa Freeman Gordon SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa The Awesome Store Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location. SANParks, South Africa Augrabies Falls National Park,South Africa Census data EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature Agulhas falls national Park -120.311210 -120.311210 26.0 26.0 1998 2004-02-13 genus Antidorcas species marsupialis Hartmans Zebra Genus Cercopithecus Species aethiops Vervet monkey Genus Diceros Species bicornis Baboon Genus Equus Species hartmannae Giraffe Genus Giraffa Species camelopardalis Kudu Genus Oreotragus Species oreotragus Gemsbok Genus Oryz Species gazella Eland Genus Papio Species hamadryas Genus Taurotragus Species oryx Black rhino Genus Tragelaphus Species strepsiceros Klipspringer 1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
FIELD NAME=geohash_1, VALUE=[9]
FIELD NAME=geohash_2, VALUE=[9k]
FIELD NAME=geohash_3, VALUE=[9kd]
FIELD NAME=geohash_4, VALUE=[9kd7]
FIELD NAME=geohash_5, VALUE=[9kd7y]
FIELD NAME=geohash_6, VALUE=[9kd7ym]
FIELD NAME=geohash_7, VALUE=[9kd7ym0]
FIELD NAME=geohash_8, VALUE=[9kd7ym0h]
FIELD NAME=geohash_9, VALUE=[9kd7ym0hc]
FIELD NAME=serviceCoupling, VALUE=false
FIELD NAME=_version_, VALUE=1647501835846549504
FIELD NAME=sem_annotation, VALUE=[http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass]
FIELD NAME=sem_annotated_by, VALUE=[annotation.130.4]
annotationValue: http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass
Comparing value for field abstract
Doc Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
Solr Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
Comparing value for field keywords
Doc Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
Solr Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
Comparing value for field title
Doc Value: Augrabies falls National Park census data.
Solr Value: Augrabies falls National Park census data.
Comparing value for field southBoundCoord
Doc Value: 26.0
Solr Value: 26.0
Comparing value for field northBoundCoord
Doc Value: 26.0
Solr Value: 26.0
Comparing value for field westBoundCoord
Doc Value: -120.31121
Solr Value: -120.31121
Comparing value for field eastBoundCoord
Doc Value: -120.31121
Solr Value: -120.31121
Comparing value for field site
Doc Value: [Agulhas falls national Park]
Solr Value: [Agulhas falls national Park]
Comparing value for field beginDate
Doc Value: Thu Jan 01 00:00:00 CST 1998
Solr Value: Thu Jan 01 00:00:00 CST 1998
Comparing value for field endDate
Doc Value: Fri Feb 13 00:00:00 CST 2004
Solr Value: Fri Feb 13 00:00:00 CST 2004
Comparing value for field author
Doc Value: SANParks
Solr Value: SANParks
Comparing value for field authorSurName
Doc Value: SANParks
Solr Value: SANParks
Comparing value for field authorSurNameSort
Doc Value: SANParks
Solr Value: SANParks
Comparing value for field authorLastName
Doc Value: [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]
Comparing value for field investigator
Doc Value: [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]
Comparing value for field origin
Doc Value: [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
Solr Value: [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
Comparing value for field contactOrganization
Doc Value: [SANParks, The Awesome Store]
Solr Value: [SANParks, The Awesome Store]
Comparing value for field genus
Doc Value: [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
Solr Value: [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
Comparing value for field species
Doc Value: [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
Solr Value: [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
Comparing value for field scientificName
Doc Value: [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
Solr Value: [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
Comparing value for field attributeName
Doc Value: [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
Solr Value: [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
Comparing value for field attributeDescription
Doc Value: [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
Solr Value: [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
Comparing value for field attributeUnit
Doc Value: [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
Solr Value: [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
Comparing value for field attribute
Doc Value: [ID The ID dimensionless, Lat S Lat S dimensionless, Long E Long E dimensionless, Date The date, Stratum Stratum dimensionless, Transect Transect dimensionless, Species The name of species, LatS LatS dimensionless, LongE LongE dimensionless, Total The total dimensionless, Juvenile Juvenile dimensionless, L/R L/R dimensionless, Species The name of species, Stratum Stratum dimensionless, Date The date, SumOfTotal Sum of the total dimensionless, SumOfJuvenile Sum of juvenile dimensionless, Species The name of species, Date The date, SumOfTotal The sum of total dimensionless, SumOfJuvenile Sum of juvenile dimensionless]
Solr Value: [ID The ID dimensionless, Lat S Lat S dimensionless, Long E Long E dimensionless, Date The date, Stratum Stratum dimensionless, Transect Transect dimensionless, Species The name of species, LatS LatS dimensionless, LongE LongE dimensionless, Total The total dimensionless, Juvenile Juvenile dimensionless, L/R L/R dimensionless, Species The name of species, Stratum Stratum dimensionless, Date The date, SumOfTotal Sum of the total dimensionless, SumOfJuvenile Sum of juvenile dimensionless, Species The name of species, Date The date, SumOfTotal The sum of total dimensionless, SumOfJuvenile Sum of juvenile dimensionless]
Comparing value for field fileID
Doc Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Comparing value for field text
Doc Value: Augrabies falls National Park census data. SANParks Garcia Freddy SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa Freeman Gordon SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa The Awesome Store Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location. SANParks, South Africa Augrabies Falls National Park,South Africa Census data EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature Agulhas falls national Park -120.311210 -120.311210 26.0 26.0 1998 2004-02-13 genus Antidorcas species marsupialis Hartmans Zebra Genus Cercopithecus Species aethiops Vervet monkey Genus Diceros Species bicornis Baboon Genus Equus Species hartmannae Giraffe Genus Giraffa Species camelopardalis Kudu Genus Oreotragus Species oreotragus Gemsbok Genus Oryz Species gazella Eland Genus Papio Species hamadryas Genus Taurotragus Species oryx Black rhino Genus Tragelaphus Species strepsiceros Klipspringer 1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
Solr Value: Augrabies falls National Park census data. SANParks Garcia Freddy SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa Freeman Gordon SANParks Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa The Awesome Store Regional Ecologists Private Bag x402 Skukuza, 1350 South Africa This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location. SANParks, South Africa Augrabies Falls National Park,South Africa Census data EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature Agulhas falls national Park -120.311210 -120.311210 26.0 26.0 1998 2004-02-13 genus Antidorcas species marsupialis Hartmans Zebra Genus Cercopithecus Species aethiops Vervet monkey Genus Diceros Species bicornis Baboon Genus Equus Species hartmannae Giraffe Genus Giraffa Species camelopardalis Kudu Genus Oreotragus Species oreotragus Gemsbok Genus Oryz Species gazella Eland Genus Papio Species hamadryas Genus Taurotragus Species oryx Black rhino Genus Tragelaphus Species strepsiceros Klipspringer 1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
Comparing value for field geohash_1
Doc Value: [9]
Solr Value: [9]
Comparing value for field geohash_2
Doc Value: [9k]
Solr Value: [9k]
Comparing value for field geohash_3
Doc Value: [9kd]
Solr Value: [9kd]
Comparing value for field geohash_4
Doc Value: [9kd7]
Solr Value: [9kd7]
Comparing value for field geohash_5
Doc Value: [9kd7y]
Solr Value: [9kd7y]
Comparing value for field geohash_6
Doc Value: [9kd7ym]
Solr Value: [9kd7ym]
Comparing value for field geohash_7
Doc Value: [9kd7ym0]
Solr Value: [9kd7ym0]
Comparing value for field geohash_8
Doc Value: [9kd7ym0h]
Solr Value: [9kd7ym0h]
Comparing value for field geohash_9
Doc Value: [9kd7ym0hc]
Solr Value: [9kd7ym0hc]
Comparing value for field isService
Doc Value: false
Solr Value: false
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.341 s - in org.dataone.cn.indexer.annotation.SolrIndexAnnotatorTest
[INFO] Running org.dataone.cn.indexer.annotation.EmlAnnotationSubprocessorTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 s - in org.dataone.cn.indexer.annotation.EmlAnnotationSubprocessorTest
[INFO] Running org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest
after 3000 millis, finishing task: 45
after 3000 millis, finishing task: 46
Starting task: 48
Starting task: 49
after 3000 millis, finishing task: 47
Starting task: 50
[ INFO] 2019-10-15 22:56:27,548 [main] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:27,634 [main] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
annotation: {
  "pid": "peggym.130.4",
  "id": "annotation.130.4",
  "field": "sem_annotation",
  "reject": false,
  "ranges": [
    {
      "start": "/section[1]/article[1]/form[1]/section[1]/div[1]/div[1]",
      "end": "/section[1]/article[1]/form[1]/section[1]/div[1]/div[1]",
      "startOffset": 0,
      "endOffset": 4
    }
  ],
  "permissions": {
    "read": [ "group:__world__" ],
    "delete": [],
    "admin": [],
    "update": []
  },
  "user": "CN=Benjamin Leinfelder A515,O=University of Chicago,C=US,DC=cilogon,DC=org",
  "consumer": "metacat",
  "updated": "2014-12-03T23:29:20.262152+00:00",
  "quote": "Data",
  "oa:Motivation": "oa:tagging",
  "tags": ["http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass"],
  "text": "Original annotation content",
  "created": "2014-12-03T23:09:25.501665+00:00",
  "uri": "https://cn-dev.test.dataone.org/cn/v1/object/peggym.130.4"
}
[ERROR] 2019-10-15 22:56:27,660 [main] (org.dataone.cn.indexer.annotation.AnnotatorSubprocessor:processDocument:184) Unable to retrieve solr document: peggym.130.4. Exception attempting to communicate with solr server.
java.io.IOException: org.apache.solr.client.solrj.SolrServerException: Server refused connection at: http://localhost:8983/solr/collection1
    at org.dataone.cn.indexer.solrhttp.SolrJClient.getDocumentsBySolrId(SolrJClient.java:561)
    at org.dataone.cn.indexer.solrhttp.SolrJClient.getDocumentsByD1Identifier(SolrJClient.java:495)
    at org.dataone.cn.indexer.solrhttp.SolrJClient.retrieveDocumentFromSolrServer(SolrJClient.java:815)
    at org.dataone.cn.indexer.annotation.AnnotatorSubprocessor.processDocument(AnnotatorSubprocessor.java:180)
    at org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest.compareFields(SolrFieldAnnotatorTest.java:119)
    at org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest.testAnnotationFields(SolrFieldAnnotatorTest.java:150)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
    at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
    at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
    at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:272)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:236)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:386)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:323)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:143)
Caused by: org.apache.solr.client.solrj.SolrServerException: Server refused connection at: http://localhost:8983/solr/collection1
    at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:567)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:235)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:227)
    at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135)
    at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:943)
    at org.apache.solr.client.solrj.SolrClient.getById(SolrClient.java:1174)
    at org.apache.solr.client.solrj.SolrClient.getById(SolrClient.java:1128)
    at org.apache.solr.client.solrj.SolrClient.getById(SolrClient.java:1144)
    at org.dataone.cn.indexer.solrhttp.SolrJClient.getDocumentsBySolrId(SolrJClient.java:555)
    ... 35 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:117)
    at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
    at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
    at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
    at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:466)
    ... 43 more
[ WARN] 2019-10-15 22:56:27,674 [main] (org.dataone.cn.indexer.annotation.AnnotatorSubprocessor:processDocument:189) DID NOT LOCATE REFERENCED DOC: peggym.130.4
Checking value: peggym.130.4 in expected: [peggym.130.4]
Checking value: annotation.130.4 in expected: [annotation.130.4]
Checking value: http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass in expected: [http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass, http://ecoinformatics.org/oboe/oboe.1.0/oboe-core.owl#PhysicalCharacteristic, http://ecoinformatics.org/oboe/oboe.1.0/oboe-core.owl#Characteristic, http://www.w3.org/2000/01/rdf-schema#Resource]
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.49 s - in org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest
[INFO] Running org.dataone.cn.indexer.annotation.SolrIndexEmlAnnotationTest
Creating dataDir: /tmp/org.dataone.cn.indexer.annotation.SolrIndexEmlAnnotationTest_9CCA1952FC4B473E-001/init-core-data-001
[ INFO] 2019-10-15 22:56:27,961 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:28,037 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:28,060 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-10-15 22:56:28,195 [coreLoadExecutor-25-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType [ WARN] 2019-10-15 22:56:28,196 [coreLoadExecutor-25-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField [ WARN] 2019-10-15 22:56:28,212 [coreLoadExecutor-25-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index... [ WARN] 2019-10-15 22:56:28,222 [coreLoadExecutor-25-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english [ INFO] 2019-10-15 22:56:28,228 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:28,228 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1308baf0 [ INFO] 2019-10-15 22:56:28,229 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: eml-test-doc [ INFO] 2019-10-15 22:56:28,231 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from 
parseDocuments: 1 [ INFO] 2019-10-15 22:56:28,231 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@56e63991 [ INFO] 2019-10-15 22:56:28,232 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:28,236 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:28,236 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3281477c [ INFO] 2019-10-15 22:56:28,236 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:28,237 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[9CCA1952FC4B473E]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false Comparing value for field abstract Doc Value: Solr Value: Comparing value for field title Doc Value: EML Annotation Example Solr Value: EML Annotation Example Comparing value for field project Doc Value: [MY PROJECT] Solr Value: [MY PROJECT] Comparing value for field funding Doc Value: [SOME_RANDOM_FUNDING_INFO] Solr Value: [SOME_RANDOM_FUNDING_INFO] Comparing value for field funderName Doc Value: [My Funder] Solr Value: [My 
Funder] Comparing value for field funderIdentifier Doc Value: [MY_FUNDER] Solr Value: [MY_FUNDER] Comparing value for field awardNumber Doc Value: [AWARD1] Solr Value: [AWARD1] Comparing value for field awardTitle Doc Value: [An example award title] Solr Value: [An example award title] Comparing value for field author Doc Value: EML Annotator Solr Value: EML Annotator Comparing value for field authorGivenName Doc Value: EML Solr Value: EML Comparing value for field authorSurName Doc Value: Annotator Solr Value: Annotator Comparing value for field authorGivenNameSort Doc Value: EML Solr Value: EML Comparing value for field authorSurNameSort Doc Value: Annotator Solr Value: Annotator Comparing value for field authorLastName Doc Value: [Annotator] Solr Value: [Annotator] Comparing value for field investigator Doc Value: [Annotator] Solr Value: [Annotator] Comparing value for field origin Doc Value: [EML Annotator] Solr Value: [EML Annotator] Comparing value for field attributeName Doc Value: [SOME_ATTRIBUTE] Solr Value: [SOME_ATTRIBUTE] Comparing value for field attributeDescription Doc Value: [SOME_ATTRIBUTE's definition] Solr Value: [SOME_ATTRIBUTE's definition] Comparing value for field attribute Doc Value: [SOME_ATTRIBUTE SOME_ATTRIBUTE's definition] Solr Value: [SOME_ATTRIBUTE SOME_ATTRIBUTE's definition] Comparing value for field fileID Doc Value: https://cn.dataone.org/cn/v2/resolve/eml-test-doc Solr Value: https://cn.dataone.org/cn/v2/resolve/eml-test-doc Comparing value for field text Doc Value: EML Annotation Example EML Annotator EML Annotator MY PROJECT EML Annotator principalInvestigator SOME_RANDOM_FUNDING_INFO My Funder MY_FUNDER AWARD1 An example award title https://example.org/someaward eml-test-doc SOME_ATTRIBUTE SOME_ATTRIBUTE's definition Solr Value: EML Annotation Example EML Annotator EML Annotator MY PROJECT EML Annotator principalInvestigator SOME_RANDOM_FUNDING_INFO My Funder MY_FUNDER AWARD1 An example award title 
https://example.org/someaward eml-test-doc SOME_ATTRIBUTE SOME_ATTRIBUTE's definition Comparing value for field isService Doc Value: false Solr Value: false Comparing value for field id Doc Value: eml-test-doc Solr Value: eml-test-doc Comparing value for field identifier Doc Value: eml-test-doc Solr Value: eml-test-doc Comparing value for field formatId Doc Value: https://eml.ecoinformatics.org/eml-2.2.0 Solr Value: https://eml.ecoinformatics.org/eml-2.2.0 Comparing value for field formatType Doc Value: METADATA Solr Value: METADATA Comparing value for field size Doc Value: 0 Solr Value: 0 Comparing value for field checksum Doc Value: 12345 Solr Value: 12345 Comparing value for field submitter Doc Value: dataone_integration_test_user Solr Value: dataone_integration_test_user Comparing value for field checksumAlgorithm Doc Value: MD5 Solr Value: MD5 Comparing value for field rightsHolder Doc Value: dataone_integration_test_user Solr Value: dataone_integration_test_user Comparing value for field replicationAllowed Doc Value: true Solr Value: true Comparing value for field dateUploaded Doc Value: Wed Jul 31 15:59:47 EDT 2019 Solr Value: Wed Jul 31 15:59:47 EDT 2019 Comparing value for field dateModified Doc Value: Wed Jul 31 15:59:47 EDT 2019 Solr Value: Wed Jul 31 15:59:47 EDT 2019 Comparing value for field datasource Doc Value: test_documents Solr Value: test_documents Comparing value for field authoritativeMN Doc Value: test_documents Solr Value: test_documents Comparing value for field readPermission Doc Value: [public, dataone_public_user] Solr Value: [public, dataone_public_user] Comparing value for field writePermission Doc Value: [dataone_integration_test_user] Solr Value: [dataone_integration_test_user] Comparing value for field isPublic Doc Value: true Solr Value: true Comparing value for field dataUrl Doc Value: https://cn.dataone.org/cn/v2/resolve/eml-test-doc Solr Value: https://cn.dataone.org/cn/v2/resolve/eml-test-doc [INFO] Tests run: 1, Failures: 0, 
Errors: 0, Skipped: 0, Time elapsed: 0.6 s - in org.dataone.cn.indexer.annotation.SolrIndexEmlAnnotationTest
[INFO] Running org.dataone.cn.indexer.annotation.OntologyModelServiceTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.dataone.cn.indexer.annotation.OntologyModelServiceTest
[INFO] Running org.dataone.cn.indexer.annotation.SolrFieldEmlAnnotationTest
[ INFO] 2019-10-15 22:56:28,630 [main] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:28,704 [main] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.436 s - in org.dataone.cn.indexer.annotation.SolrFieldEmlAnnotationTest
[INFO] Running org.dataone.cn.indexer.XmlDocumentUtilityTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.dataone.cn.indexer.XmlDocumentUtilityTest
[INFO] Running org.dataone.cn.indexer.parser.TestUpdateAssembler
[ INFO] 2019-10-15 22:56:28,729 [main] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
Name: formatId Modifier: set Value: emlversion2 Name: title Modifier: set Value: bestPublicationYet Name: _version_ Modifier: null Value: 1234567890 Name: id Modifier: null Value: MD PID1234567890MD DATA MD ORE MDOREDATA-1 Name: id Modifier: null Value: MD Name: formatId Modifier: null Value: emlversion2 Name: title Modifier: null Value: bestPublicationYet Name: _version_ Modifier: null Value: -1 MDOREDATA-1 id modifier: null resourceMap Modifier: null documents Modifier: null _version_ Modifier: null
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.014 s - in org.dataone.cn.indexer.parser.TestUpdateAssembler
[INFO] Running org.dataone.cn.index.SolrFieldDataCiteTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.014 s - in org.dataone.cn.index.SolrFieldDataCiteTest
[INFO] Running org.dataone.cn.index.SolrIndexBatchAddTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrIndexBatchAddTest_1B84DBA14BC09AB4-001/init-core-data-001
[ INFO] 2019-10-15 22:56:28,949 [TEST-SolrIndexBatchAddTest.testBatchAddCorrect-seed#[1B84DBA14BC09AB4]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:29,014 [TEST-SolrIndexBatchAddTest.testBatchAddCorrect-seed#[1B84DBA14BC09AB4]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:29,044 [TEST-SolrIndexBatchAddTest.testBatchAddCorrect-seed#[1B84DBA14BC09AB4]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-10-15 22:56:29,187 [coreLoadExecutor-35-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-10-15 22:56:29,188 [coreLoadExecutor-35-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-10-15 22:56:29,204 [coreLoadExecutor-35-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-10-15 22:56:29,213 [coreLoadExecutor-35-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ INFO] 2019-10-15 22:56:29,799 [TEST-SolrIndexBatchAddTest.testBatchAddRuntime-seed#[1B84DBA14BC09AB4]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:29,872 [TEST-SolrIndexBatchAddTest.testBatchAddRuntime-seed#[1B84DBA14BC09AB4]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:29,882 [TEST-SolrIndexBatchAddTest.testBatchAddRuntime-seed#[1B84DBA14BC09AB4]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.143 s - in org.dataone.cn.index.SolrIndexBatchAddTest
[INFO] Running org.dataone.cn.index.SolrFieldXPathEmlTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.033 s - in org.dataone.cn.index.SolrFieldXPathEmlTest
[INFO] Running org.dataone.cn.index.SolrFieldDublinCoreTest
[ERROR] 2019-10-15 22:56:29,938 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198)
[ERROR] 2019-10-15 22:56:29,938 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[ERROR] 2019-10-15 22:56:29,938 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198)
[ERROR] 2019-10-15 22:56:29,938 [main] (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[ERROR] 2019-10-15 22:56:29,939 [main] (org.dataone.cn.indexer.parser.TemporalPeriodSolrField:getFields:79) Couldn't extract 'start' or 'end' date for pid dcterms_spatial_no_namespace. Temporal pattern of type period needs to contain at least one of these. Value was:
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.015 s - in org.dataone.cn.index.SolrFieldDublinCoreTest
[INFO] Running org.dataone.cn.index.TestResourceMapIndexTask
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.dataone.cn.index.TestResourceMapIndexTask
[INFO] Running org.dataone.cn.index.InvalidXmlCharTest
Hibernate: select indextask0_.id as id6_, indextask0_.dateSysMetaModified as dateSysM2_6_, indextask0_.deleted as deleted6_, indextask0_.formatId as formatId6_, indextask0_.nextExecution as nextExec5_6_, indextask0_.objectPath as objectPath6_, indextask0_.pid as pid6_, indextask0_.priority as priority6_, indextask0_.status as status6_, indextask0_.sysMetadata as sysMeta10_6_, indextask0_.taskModifiedDate as taskMod11_6_, indextask0_.tryCount as tryCount6_, indextask0_.version as version6_ from index_task indextask0_ where indextask0_.pid=?
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: text/plain
field value: DATA
field value: 684336
field value: 4504b4dd97f2d7a4766dfaaa3f968ec2
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: MD5
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: false
field value: 2012-03-07T17:26:09.962Z
field value: 2012-03-07T17:27:22.879Z
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: completed
field value: 2012-03-07T00:00:00.000Z
field value: public
field value: true
field value: https://cn.dataone.org/cn/v2/resolve/testMNodeTier3%3A2012679267486_common-bmp-doc-example-%E0%B8%89%E0%B8%B1%E0%B8%99%E0%B8%81%E0%B8%B4%E0%B8%99%E0%B8%81%E0%B8%A3%E0%B8%B0%E0%B8%88%E0%B8%81%E0%B9%84%E0%B8%94%E0%B9%89
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
Hibernate: select indextask0_.id as id6_, indextask0_.dateSysMetaModified as dateSysM2_6_, indextask0_.deleted as deleted6_, indextask0_.formatId as formatId6_, indextask0_.nextExecution as nextExec5_6_, indextask0_.objectPath as objectPath6_, indextask0_.pid as pid6_, indextask0_.priority as priority6_, indextask0_.status as status6_, indextask0_.sysMetadata as sysMeta10_6_, indextask0_.taskModifiedDate as taskMod11_6_, indextask0_.tryCount as tryCount6_, indextask0_.version as version6_ from index_task indextask0_ where indextask0_.pid=?
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: text/plain
field value: DATA
field value: 684336
field value: 4504b4dd97f2d7a4766dfaaa3f968ec2
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: MD5
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: false
field value: 2012-03-07T17:26:09.962Z
field value: 2012-03-07T17:27:22.879Z
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: completed
field value: 2012-03-07T00:00:00.000Z
field value: public
field value: true
field value: https://cn.dataone.org/cn/v2/resolve/testMNodeTier3%3A2012679267486_common-bmp-doc-example-%E0%B8%89%E0%B8%B1%E0%B8%99%E0%B8%81%E0%B8%B4%E0%B8%99%E0%B8%81%E0%B8%A3%E0%B8%B0%E0%B8%88%E0%B8%81%E0%B9%84%E0%B8%94%E0%B9%89
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: text/plain
field value: DATA
field value: 684336
field value: 4504b4dd97f2d7a4766dfaaa3f968ec2
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: MD5
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: false
field value: 2012-03-07T17:26:09.962Z
field value: 2012-03-07T17:27:22.879Z
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: completed
field value: 2012-03-07T00:00:00.000Z
field value: public
field value: true
field value: https://cn.dataone.org/cn/v2/resolve/testMNodeTier3%3A2012679267486_common-bmp-doc-example-%E0%B8%89%E0%B8%B1%E0%B8%99%E0%B8%81%E0%B8%B4%E0%B8%99%E0%B8%81%E0%B8%A3%E0%B8%B0%E0%B8%88%E0%B8%81%E0%B9%84%E0%B8%94%E0%B9%89
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.23 s - in org.dataone.cn.index.InvalidXmlCharTest
[INFO] Running org.dataone.cn.index.SolrFieldDublinCoreOAITest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.dataone.cn.index.SolrFieldDublinCoreOAITest
[INFO] Running org.dataone.cn.index.SolrFieldIsotc211Test
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.175 s - in org.dataone.cn.index.SolrFieldIsotc211Test
[INFO] Running org.dataone.cn.index.SolrSearchIndexQueryTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrSearchIndexQueryTest_B76F56BB9AF4B152-001/init-core-data-001
after 3000 millis, finishing task: 48
after 3000 millis, finishing task: 49
Starting task: 51
Starting task: 52
after 3000 millis, finishing task: 50
Starting task: 53
[ INFO] 2019-10-15 22:56:30,638 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:30,715 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:30,736 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-10-15 22:56:30,871 [coreLoadExecutor-45-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-10-15 22:56:30,872 [coreLoadExecutor-45-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-10-15 22:56:30,885 [coreLoadExecutor-45-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-10-15 22:56:30,892 [coreLoadExecutor-45-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ WARN] 2019-10-15 22:56:30,908 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:30,908 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:30,909 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=5}
[ WARN] 2019-10-15 22:56:30,909 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:30,909 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:30,912 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:30,913 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@4cad96b
[ INFO] 2019-10-15 22:56:30,913 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:30,921 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:30,922 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@7b67ac99
[ INFO] 2019-10-15 22:56:30,922 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:30,923 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:31,146 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:31,208 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:31,216 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:31,237 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:31,237 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:31,238 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=13}
[ WARN] 2019-10-15 22:56:31,240 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:31,241 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:31,246 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:31,247 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5f82d17e
[ INFO] 2019-10-15 22:56:31,249 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:31,257 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:31,257 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3b14f43
[ INFO] 2019-10-15 22:56:31,258 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:31,258 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[B76F56BB9AF4B152]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.938 s - in org.dataone.cn.index.SolrSearchIndexQueryTest
[INFO] Running org.dataone.cn.index.SolrTokenenizerTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrTokenenizerTest_2DE3830858E2D7C6-001/init-core-data-001
[ INFO] 2019-10-15 22:56:31,507 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:31,570 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:31,631 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-10-15 22:56:31,775 [coreLoadExecutor-55-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-10-15 22:56:31,777 [coreLoadExecutor-55-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-10-15 22:56:31,792 [coreLoadExecutor-55-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-10-15 22:56:31,801 [coreLoadExecutor-55-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ WARN] 2019-10-15 22:56:31,811 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:31,812 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:31,812 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-10-15 22:56:31,812 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:31,813 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:31,817 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:31,817 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6f9b2823
[ INFO] 2019-10-15 22:56:31,817 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:31,830 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:31,836 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@15aab593
[ INFO] 2019-10-15 22:56:31,839 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:31,844 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:31,875 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:31,878 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2bc9a7ea
[ INFO] 2019-10-15 22:56:31,881 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:31,885 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:31,888 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1936cf9d
[ INFO] 2019-10-15 22:56:31,891 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:31,894 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:32,160 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:32,227 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:32,238 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:32,249 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:32,250 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:32,251 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-10-15 22:56:32,252 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:32,253 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:32,256 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,256 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3c8f70a1
[ INFO] 2019-10-15 22:56:32,259 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:32,267 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,267 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5ac66e8d
[ INFO] 2019-10-15 22:56:32,268 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:32,268 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:32,292 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,292 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3ada3d63
[ INFO] 2019-10-15 22:56:32,292 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:32,294 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,294 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@202c1557
[ INFO] 2019-10-15 22:56:32,295 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:32,295 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:32,541 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:32,621 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:32,630 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:32,640 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:32,640 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:32,641 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-10-15 22:56:32,641 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:32,641 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:32,644 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,645 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2efbbc58
[ INFO] 2019-10-15 22:56:32,647 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:32,654 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,655 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3b5e1db9
[ INFO] 2019-10-15 22:56:32,655 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:32,655 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:32,677 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,678 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@71babecd
[ INFO] 2019-10-15 22:56:32,678 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:32,680 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:32,680 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1efe8c68
[ INFO] 2019-10-15 22:56:32,680 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:32,682 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:32,941 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:33,002 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:33,009 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:33,018 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:33,019 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:33,021 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-10-15 22:56:33,022 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:33,024 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:33,030 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,030 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@77cbef9d
[ INFO] 2019-10-15 22:56:33,033 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:33,040 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,041 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@27fd1655
[ INFO] 2019-10-15 22:56:33,041 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:33,041 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:33,059 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,060 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6218f7a
[ INFO] 2019-10-15 22:56:33,060 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:33,061 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,061 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3a020b0e
[ INFO] 2019-10-15 22:56:33,062 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:33,062 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:33,293 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:33,357 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:33,365 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:33,374 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:33,374 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:33,375 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-10-15 22:56:33,375 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:33,375 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:33,378 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,378 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@7cc1eec2
[ INFO] 2019-10-15 22:56:33,378 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:33,385 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,386 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@61e84bd5
[ INFO] 2019-10-15 22:56:33,386 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:33,386 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:33,406 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,406 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@19020e77
[ INFO] 2019-10-15 22:56:33,407 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:33,408 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,408 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5a375528
[ INFO] 2019-10-15 22:56:33,409 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:33,409 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
after 3000 millis, finishing task: 51
Starting task: 54
after 3000 millis, finishing task: 52
Starting task: 55
after 3000 millis, finishing task: 53
Starting task: 56
[ INFO] 2019-10-15 22:56:33,652 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:33,719 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:33,727 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:33,736 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:33,737 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:33,737 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-10-15 22:56:33,737 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:33,738 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:33,741 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,741 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@33b810fd
[ INFO] 2019-10-15 22:56:33,742 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:33,749 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,749 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@56d178ba
[ INFO] 2019-10-15 22:56:33,750 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:33,750 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:33,770 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,770 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@344599b4
[ INFO] 2019-10-15 22:56:33,771 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:33,773 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:33,773 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@ecaf3ad
[ INFO] 2019-10-15 22:56:33,773 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:33,774 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:34,040 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:34,131 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:34,139 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:34,149 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:34,150 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:34,150 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-10-15 22:56:34,150 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:34,151 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:34,153 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,153 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@70dc0131
[ INFO] 2019-10-15 22:56:34,154 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:34,163 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,163 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3f62f78e
[ INFO] 2019-10-15 22:56:34,164 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:34,164 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:34,180 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,180 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@fc03e68
[ INFO] 2019-10-15 22:56:34,181 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:34,182 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,182 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5e591813
[ INFO] 2019-10-15 22:56:34,182 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:34,183 [TEST-SolrTokenenizerTest.testQuotations-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
id = peggym.130.4
identifier = peggym.130.4
seriesId = peggym.130
formatId = eml://ecoinformatics.org/eml-2.1.0
formatType = METADATA
size = 36281
checksum = 24426711d5385a9ffa583a13d07af2502884932f
submitter = dataone_integration_test_user
checksumAlgorithm = SHA-1
rightsHolder = dataone_integration_test_user
replicationAllowed = true
obsoletes = peggym.130.3
dateUploaded = Wed Aug 31 15:59:50 MMT 2011
updateDate = Wed Aug 31 15:59:50 MMT 2011
dateModified = Wed Aug 31 15:59:50 MMT 2011
datasource = test_documents
authoritativeMN = test_documents
readPermission = [public, dataone_public_user, dataone_test_user]
writePermission = [dataone_integration_test_user]
isPublic = true
dataUrl = https://cn.dataone.org/cn/v2/resolve/peggym.130.4
isService = false
abstract = This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
keywords = [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
title = Augrabies falls National Park census data.
southBoundCoord = 26.0
northBoundCoord = 26.0
westBoundCoord = -120.31121
eastBoundCoord = -120.31121
site = [Agulhas falls national Park]
beginDate = Thu Jan 01 00:00:00 MMT 1998
endDate = Fri Feb 13 00:00:00 MMT 2004
author = SANParks
authorSurName = SANParks
authorSurNameSort = SANParks
authorLastName = [SANParks, Garcia, Freeman]
investigator = [SANParks, Garcia, Freeman]
origin = [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
contactOrganization = [SANParks, The Awesome Store]
genus = [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
species = [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
scientificName = [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
attributeName = [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
attributeDescription = [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
attributeUnit = [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
attribute = [ID The ID dimensionless, Lat S Lat S dimensionless, Long E Long E dimensionless, Date The date, Stratum Stratum dimensionless, Transect Transect dimensionless, Species The name of species, LatS LatS dimensionless, LongE LongE dimensionless, Total The total dimensionless, Juvenile Juvenile dimensionless, L/R L/R dimensionless, Species The name of species, Stratum Stratum dimensionless, Date The date, SumOfTotal Sum of the total dimensionless, SumOfJuvenile Sum of juvenile dimensionless, Species The name of species, Date The date, SumOfTotal The sum of total dimensionless, SumOfJuvenile Sum of juvenile dimensionless]
fileID = https://cn.dataone.org/cn/v2/resolve/peggym.130.4
geohash_1 = [9]
geohash_2 = [9k]
geohash_3 = [9kd]
geohash_4 = [9kd7]
geohash_5 = [9kd7y]
geohash_6 = [9kd7ym]
geohash_7 = [9kd7ym0]
geohash_8 = [9kd7ym0h]
geohash_9 = [9kd7ym0hc]
_version_ = 1647501843279904768
serviceCoupling = false
==========================================================
id = tao.12930.1
identifier = tao.12930.1
formatId = eml://ecoinformatics.org/eml-2.1.0
formatType = METADATA
size = 68457
checksum = bda6ad5bc761f1f9824ea38b249abde5fc721283
submitter = dataone_integration_test_user
checksumAlgorithm = SHA-1
rightsHolder = dataone_integration_test_user
replicationAllowed = true
dateUploaded = Wed Aug 31 15:59:47 MMT 2011
updateDate = Wed Aug 31 15:59:47 MMT 2011
dateModified = Wed Aug 31 15:59:47 MMT 2011
datasource = test_documents
authoritativeMN = test_documents
readPermission = [dataone_public_user]
writePermission = [dataone_integration_test_user]
dataUrl = https://cn.dataone.org/cn/v2/resolve/tao.12930.1
isService = false
abstract =
title = test again
author = tao
authorSurName = tao
authorSurNameSort = tao
authorLastName = [tao]
investigator = [tao]
origin = [tao]
fileID = https://cn.dataone.org/cn/v2/resolve/tao.12930.1
_version_ = 1647501843298779136
serviceCoupling = false
==========================================================
[ INFO] 2019-10-15 22:56:34,401 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:34,461 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:34,468 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:34,478 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:34,478 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:34,478 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-10-15 22:56:34,479 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:34,479 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:34,482 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,483 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6480e58c
[ INFO] 2019-10-15 22:56:34,483 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:34,490 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,490 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@641fcc3e
[ INFO] 2019-10-15 22:56:34,490 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:34,491 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:34,506 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,507 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2459fefb
[ INFO] 2019-10-15 22:56:34,507 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1
[ INFO] 2019-10-15 22:56:34,509 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:34,509 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@465a2a7d
[ INFO] 2019-10-15 22:56:34,509 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:34,510 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-10-15 22:56:34,712 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:34,766 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:34,772 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-10-15 22:56:34,781 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:34,781 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:34,781 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-10-15 22:56:34,782 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:34,782 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:34,785 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:34,785 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2e3da048 [ INFO] 2019-10-15 22:56:34,785 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4 [ INFO] 2019-10-15 22:56:34,792 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:34,792 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@43ce9470 [ INFO] 2019-10-15 22:56:34,792 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:34,792 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false [ INFO] 2019-10-15 22:56:34,807 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:34,807 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] 
(org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6a8f544a [ INFO] 2019-10-15 22:56:34,808 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: tao.12930.1 [ INFO] 2019-10-15 22:56:34,809 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:34,809 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@45384721 [ INFO] 2019-10-15 22:56:34,810 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:34,810 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[2DE3830858E2D7C6]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false [INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.529 s - in org.dataone.cn.index.SolrTokenenizerTest [INFO] Running org.dataone.cn.index.SolrRangeQueryTest Creating dataDir: /tmp/org.dataone.cn.index.SolrRangeQueryTest_3504B7D0B5705E3D-001/init-core-data-001 [ INFO] 2019-10-15 22:56:35,026 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml [ WARN] 2019-10-15 22:56:35,077 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] 
(org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5 [ WARN] 2019-10-15 22:56:35,094 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib). [ WARN] 2019-10-15 22:56:35,253 [coreLoadExecutor-65-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType [ WARN] 2019-10-15 22:56:35,255 [coreLoadExecutor-65-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField [ WARN] 2019-10-15 22:56:35,267 [coreLoadExecutor-65-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index... 
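The two AbstractSpatialFieldType warnings above ask for `distanceUnits` in place of the deprecated `units` attribute on the spatial field types. A hedged schema.xml sketch of the rename — the field-type name and other attributes here are illustrative, not taken from the project's actual schema:

```xml
<!-- deprecated spelling -->
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
           geo="true" units="degrees"/>

<!-- replacement accepted by Solr 5+ -->
<fieldType name="location_rpt" class="solr.SpatialRecursivePrefixTreeFieldType"
           geo="true" distanceUnits="degrees"/>
```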
[ WARN] 2019-10-15 22:56:35,276 [coreLoadExecutor-65-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english ***** before delete ********************************* [ WARN] 2019-10-15 22:56:35,287 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null [ WARN] 2019-10-15 22:56:35,288 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1 [ WARN] 2019-10-15 22:56:35,288 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2} [ WARN] 2019-10-15 22:56:35,288 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0 [ WARN] 2019-10-15 22:56:35,289 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All... 
***** between ********************************** [ INFO] 2019-10-15 22:56:35,292 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:35,292 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@19cd9fde [ INFO] 2019-10-15 22:56:35,293 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4 [ INFO] 2019-10-15 22:56:35,300 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:35,300 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@8009f91 [ INFO] 2019-10-15 22:56:35,300 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:35,301 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false ***** after load ********************************* ***** four field range query after 3000 millis, finishing task: 55 after 3000 millis, finishing task: 54 Starting task: 58 Starting task: 57 after 3000 millis, finishing task: 56 Starting task: 59 [ INFO] 2019-10-15 22:56:37,542 
[TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml [ WARN] 2019-10-15 22:56:37,601 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5 [ WARN] 2019-10-15 22:56:37,609 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started... ***** before delete ********************************* [ WARN] 2019-10-15 22:56:37,621 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null [ WARN] 2019-10-15 22:56:37,621 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1 [ WARN] 2019-10-15 22:56:37,622 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2} [ WARN] 2019-10-15 22:56:37,622 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0 [ WARN] 2019-10-15 22:56:37,622 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All... 
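The SolrRangeQueryTest methods load fixture documents and then query them with Lucene range syntax over one or more fields. The concrete field/value combinations the tests assert are not visible in this log, so the queries below are illustrative sketches, using field names (beginDate, southBoundCoord, northBoundCoord) that do appear elsewhere in this log:

```text
# single-field range
beginDate:[1986-01-01T00:00:00Z TO 1992-12-31T23:59:59Z]

# two-field range (conjunction)
southBoundCoord:[20.0 TO 30.0] AND northBoundCoord:[20.0 TO 30.0]
```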
***** between ********************************** [ INFO] 2019-10-15 22:56:37,626 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:37,626 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@49820bdc [ INFO] 2019-10-15 22:56:37,627 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4 [ INFO] 2019-10-15 22:56:37,634 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:37,634 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@79f7c0a7 [ INFO] 2019-10-15 22:56:37,635 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:37,635 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false ***** after load ********************************* ***** two field range query after 3000 millis, finishing task: 58 after 3000 millis, finishing task: 57 Starting task: 60 Starting task: 61 after 3000 millis, finishing task: 59 Starting task: 62 [ INFO] 2019-10-15 22:56:39,840 
[TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml [ WARN] 2019-10-15 22:56:39,893 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5 [ WARN] 2019-10-15 22:56:39,899 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started... ***** before delete ********************************* [ WARN] 2019-10-15 22:56:39,907 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null [ WARN] 2019-10-15 22:56:39,907 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1 [ WARN] 2019-10-15 22:56:39,907 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2} [ WARN] 2019-10-15 22:56:39,907 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0 [ WARN] 2019-10-15 22:56:39,908 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All... 
***** between ********************************** [ INFO] 2019-10-15 22:56:39,910 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:39,910 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5491ed33 [ INFO] 2019-10-15 22:56:39,910 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4 [ INFO] 2019-10-15 22:56:39,916 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:39,917 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@60e8dcb1 [ INFO] 2019-10-15 22:56:39,917 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:39,917 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[3504B7D0B5705E3D]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false ***** after load ********************************* ***** simpleRangeQuery [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.119 s - in org.dataone.cn.index.SolrRangeQueryTest [INFO] Running org.dataone.cn.index.IndexTaskProcessingIntegrationTest [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 
0, Time elapsed: 0.006 s - in org.dataone.cn.index.IndexTaskProcessingIntegrationTest [INFO] Running org.dataone.cn.index.SolrIndexReprocessTest [WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.001 s - in org.dataone.cn.index.SolrIndexReprocessTest [INFO] Running org.dataone.cn.index.SolrIndexFieldTest Creating dataDir: /tmp/org.dataone.cn.index.SolrIndexFieldTest_F64EE65E6ADCDE6E-001/init-core-data-001 [ INFO] 2019-10-15 22:56:42,215 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml [ WARN] 2019-10-15 22:56:42,270 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5 [ WARN] 2019-10-15 22:56:42,290 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib). 
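The recurring SolrResourceLoader warning ("Can't find (or read) directory to add to classloader: lib") is harmless when the test core needs no extra plugin jars; if the noise matters, creating the directory the core resolves is enough. A sketch, assuming the workspace-relative layout shown in the log:

```shell
# Create the (empty) lib directory the Solr core resolves on startup,
# silencing the "Can't find (or read) directory" warning.
mkdir -p src/test/resources/org/dataone/cn/index/resources/solr5home/lib
```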
[ WARN] 2019-10-15 22:56:42,420 [coreLoadExecutor-75-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType [ WARN] 2019-10-15 22:56:42,420 [coreLoadExecutor-75-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField after ३००० millis, finishing task: ६० after ३००० millis, finishing task: ६१ Starting task: 63 Starting task: 64 after ३००० millis, finishing task: ६२ Starting task: 65 [ WARN] 2019-10-15 22:56:42,431 [coreLoadExecutor-75-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index... [ WARN] 2019-10-15 22:56:42,437 [coreLoadExecutor-75-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english [ INFO] 2019-10-15 22:56:42,441 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:42,442 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@52863cf0 [ INFO] 2019-10-15 22:56:42,442 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: 68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml [ INFO] 2019-10-15 
22:56:42,451 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:42,452 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@370da7f0 [ INFO] 2019-10-15 22:56:42,452 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:42,453 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false Comparing value for field abstract Doc Value: The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil moisture and climate information system designed to provide data to support natural resource assessments and conservation activities. Administered by the United States Department of Agriculture Natural Resources Conservation Service (NRCS) through the National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil temperature and soil moisture content at several depths, soil water level, air temperature, relative humidity, solar radiation, wind, precipitation, barometric pressure, and more. Solr Value: The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil moisture and climate information system designed to provide data to support natural resource assessments and conservation activities. 
Administered by the United States Department of Agriculture Natural Resources Conservation Service (NRCS) through the National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil temperature and soil moisture content at several depths, soil water level, air temperature, relative humidity, solar radiation, wind, precipitation, barometric pressure, and more. Comparing value for field beginDate NOTE: reproduce with: ant test -Dtestcase=SolrIndexFieldTest -Dtests.method=testComplexSystemMetadataAndFgdcScienceData -Dtests.seed=F64EE65E6ADCDE6E -Dtests.locale=hi_IN -Dtests.timezone=America/Havana -Dtests.asserts=true -Dtests.file.encoding=UTF-8 [ INFO] 2019-10-15 22:56:42,666 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml [ WARN] 2019-10-15 22:56:42,717 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5 [ WARN] 2019-10-15 22:56:42,723 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started... 
[ INFO] 2019-10-15 22:56:42,726 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:42,726 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@743db9f2 [ INFO] 2019-10-15 22:56:42,728 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns [ INFO] 2019-10-15 22:56:42,732 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:42,733 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@77b5a2fd [ INFO] 2019-10-15 22:56:42,733 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:42,734 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false Comparing value for field abstract Doc Value: Adult prawn species, size, sex, reproductive stage, moult stage, and parasites were measured at 20 stations in Albatross Bay, Gulf of Carpentaria. 
Sampling was carried out monthly between 1986 and 1992. This metadata record is sourced from 'MarLIN', the CSIRO Marine Laboratories Information Network. Solr Value: Adult prawn species, size, sex, reproductive stage, moult stage, and parasites were measured at 20 stations in Albatross Bay, Gulf of Carpentaria. Sampling was carried out monthly between 1986 and 1992. This metadata record is sourced from 'MarLIN', the CSIRO Marine Laboratories Information Network. Comparing value for field beginDate NOTE: reproduce with: ant test -Dtestcase=SolrIndexFieldTest -Dtests.method=testSystemMetadataAndFgdcScienceData -Dtests.seed=F64EE65E6ADCDE6E -Dtests.locale=hi_IN -Dtests.timezone=America/Havana -Dtests.asserts=true -Dtests.file.encoding=UTF-8 [ INFO] 2019-10-15 22:56:42,923 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml [ WARN] 2019-10-15 22:56:42,973 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5 [ WARN] 2019-10-15 22:56:42,979 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started... 
[ INFO] 2019-10-15 22:56:42,982 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:42,982 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@7dd7e439 [ INFO] 2019-10-15 22:56:42,982 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4 [ INFO] 2019-10-15 22:56:42,990 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1 [ INFO] 2019-10-15 22:56:42,992 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@45d346e0 [ INFO] 2019-10-15 22:56:42,993 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null [ INFO] 2019-10-15 22:56:42,993 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[F64EE65E6ADCDE6E]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false Comparing value for field abstract Doc Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. 
If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location. Solr Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank. This is a test. If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location. Comparing value for field keywords Doc Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature] Solr Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature] Comparing value for field title Doc Value: Augrabies falls National Park census data. Solr Value: Augrabies falls National Park census data. Comparing value for field southBoundCoord Doc Value: 26.0 Solr Value: 26.0 Comparing value for field northBoundCoord Doc Value: 26.0 Solr Value: 26.0 Comparing value for field westBoundCoord Doc Value: -120.31121 Solr Value: -120.31121 Comparing value for field eastBoundCoord Doc Value: -120.31121 Solr Value: -120.31121 Comparing value for field site Doc Value: [Agulhas falls national Park] Solr Value: [Agulhas falls national Park] Comparing value for field beginDate NOTE: reproduce with: ant test -Dtestcase=SolrIndexFieldTest -Dtests.method=testSystemMetadataAndEml210ScienceData -Dtests.seed=F64EE65E6ADCDE6E -Dtests.locale=hi_IN -Dtests.timezone=America/Havana -Dtests.asserts=true -Dtests.file.encoding=UTF-8 NOTE: leaving temporary files on disk at: /tmp/org.dataone.cn.index.SolrIndexFieldTest_F64EE65E6ADCDE6E-001 NOTE: test params are: codec=Asserting(Lucene50): {}, docValues:{}, sim=RandomSimilarityProvider(queryNorm=true,coord=crazy): {}, locale=hi_IN, timezone=America/Havana NOTE: Linux 4.15.0-46-generic amd64/Private Build 1.8.0_222 (64-bit)/cpus=8,threads=1,free=812257752,total=1623195648 NOTE: All tests run in this JVM: 
[ProvRdfXmlProcessorTest, SolrIndexAnnotatorTest, SolrIndexEmlAnnotationTest, SolrIndexBatchAddTest, SolrSearchIndexQueryTest, SolrTokenenizerTest, SolrRangeQueryTest, SolrIndexReprocessTest, SolrIndexFieldTest]
[ERROR] Tests run: 4, Failures: 0, Errors: 3, Skipped: 1, Time elapsed: 1.063 s <<< FAILURE! - in org.dataone.cn.index.SolrIndexFieldTest
[ERROR] testComplexSystemMetadataAndFgdcScienceData(org.dataone.cn.index.SolrIndexFieldTest) Time elapsed: 0.513 s <<< ERROR!
java.lang.IllegalArgumentException: Invalid format: "२०१४-०७-०१T००:००:००.०००Z"
	at org.dataone.cn.index.SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData(SolrIndexFieldTest.java:82)
[ERROR] testSystemMetadataAndFgdcScienceData(org.dataone.cn.index.SolrIndexFieldTest) Time elapsed: 0.258 s <<< ERROR!
java.lang.IllegalArgumentException: Invalid format: "१९८६-०३-०१T००:००:००.०००Z"
	at org.dataone.cn.index.SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData(SolrIndexFieldTest.java:145)
[ERROR] testSystemMetadataAndEml210ScienceData(org.dataone.cn.index.SolrIndexFieldTest) Time elapsed: 0.269 s <<< ERROR!
java.lang.IllegalArgumentException: Invalid format: "१९९८-०१-०१T०५:००:००.०००Z"
	at org.dataone.cn.index.SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData(SolrIndexFieldTest.java:117)
[INFO] Running org.dataone.cn.index.SolrFieldXPathDryad31Test
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.dataone.cn.index.SolrFieldXPathDryad31Test
[INFO] Running org.dataone.cn.index.SolrIndexDeleteTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrIndexDeleteTest_23A0452F867B0DE4-001/init-core-data-001
[ INFO] 2019-10-15 22:56:43,222 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:56:43,272 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-10-15 22:56:43,289 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
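The three IllegalArgumentException failures above trace back to the randomized test locale (note `-Dtests.locale=hi_IN` in the reproduce line): a timestamp rendered with Devanagari digits, such as "१९९८-०१-०१T०५:००:००.०००Z", is rejected by an ISO-8601 parser that accepts only ASCII digits. The sketch below reproduces that class of mismatch with `java.time`; this is an illustration under assumptions, not the project's code (the build actually shades joda-time 2.2, and the explicit Devanagari `DecimalStyle` here just mimics what locale-sensitive formatting can emit under hi_IN):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.time.format.DecimalStyle;

public class DevanagariDigitsDemo {
    public static void main(String[] args) {
        // Formatter whose zero digit is Devanagari '०' (U+0966), mimicking what a
        // locale-sensitive formatter can produce when the JVM locale is hi_IN.
        DateTimeFormatter deva = DateTimeFormatter
                .ofPattern("uuuu-MM-dd'T'HH:mm:ss.SSS")
                .withDecimalStyle(DecimalStyle.STANDARD.withZeroDigit('\u0966'));

        String text = deva.format(LocalDateTime.of(1998, 1, 1, 5, 0));
        System.out.println(text); // १९९८-०१-०१T०५:००:००.००० (Devanagari digits)

        try {
            // ISO_LOCAL_DATE_TIME uses DecimalStyle.STANDARD (ASCII digits only),
            // so the Devanagari string is rejected -- the same class of failure
            // as the "Invalid format" errors in this log.
            DateTimeFormatter.ISO_LOCAL_DATE_TIME.parse(text);
        } catch (DateTimeParseException e) {
            System.out.println("parse failed: " + e.getMessage());
        }
    }
}
```

A likely remedy (an assumption, since the failing code path is not shown in this log) is to pin the date formatting/parsing code to a fixed locale such as `Locale.ROOT`, or to an ASCII `DecimalStyle`, so test results do not depend on the randomized JVM locale.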
[ WARN] 2019-10-15 22:56:43,448 [coreLoadExecutor-85-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-10-15 22:56:43,449 [coreLoadExecutor-85-thread-1] (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-10-15 22:56:43,462 [coreLoadExecutor-85-thread-1] (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-10-15 22:56:43,467 [coreLoadExecutor-85-thread-1] (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ WARN] 2019-10-15 22:56:43,475 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-10-15 22:56:43,476 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-10-15 22:56:43,476 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-10-15 22:56:43,476 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-10-15 22:56:43,477 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-10-15 22:56:43,502 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:43,502 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2b5313fa
[ INFO] 2019-10-15 22:56:43,502 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: peggym.130.4
[ INFO] 2019-10-15 22:56:43,508 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:56:43,508 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3dce1e16
[ INFO] 2019-10-15 22:56:43,508 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: null
[ INFO] 2019-10-15 22:56:43,509 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[23A0452F867B0DE4]] (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
after 3000 millis, finishing task: 63
after 3000 millis, finishing task: 64
Starting task: 66
Starting task: 67
after 3000 millis, finishing task: 65
Starting task: 68
[WARNING] Tests run: 10, Failures: 0, Errors: 0, Skipped: 9, Time elapsed: 2.44 s - in org.dataone.cn.index.SolrIndexDeleteTest
[INFO] Running org.dataone.cn.index.SolrFieldXPathFgdcTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 s - in org.dataone.cn.index.SolrFieldXPathFgdcTest
[INFO] Running org.dataone.cn.index.HazelcastClientFactoryTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.011 s - in org.dataone.cn.index.HazelcastClientFactoryTest
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData:82->DataONESolrJettyTestBase.compareFields:203 » IllegalArgument
[ERROR] SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData:117->DataONESolrJettyTestBase.compareFields:203 » IllegalArgument
[ERROR] SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData:145->DataONESolrJettyTestBase.compareFields:203 » IllegalArgument
[INFO]
[ERROR] Tests run: 106, Failures: 0, Errors: 3, Skipped: 11
[INFO]
[ERROR] There are test failures.
Please refer to /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/surefire-reports for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[JENKINS] Recording test results
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ d1_cn_index_processor ---
[INFO] Building jar: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/d1_cn_index_processor-2.4.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-shade-plugin:1.7.1:shade (default) @ d1_cn_index_processor ---
[INFO] Including xerces:xercesImpl:jar:2.9.1 in the shaded jar.
[INFO] Including xml-apis:xml-apis:jar:1.3.04 in the shaded jar.
[INFO] Including org.apache.solr:solr-solrj:jar:5.2.1 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpcore:jar:4.4.1 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpmime:jar:4.4.1 in the shaded jar.
[INFO] Including org.apache.zookeeper:zookeeper:jar:3.4.6 in the shaded jar.
[INFO] Including org.codehaus.woodstox:stax2-api:jar:3.1.4 in the shaded jar.
[INFO] Including org.codehaus.woodstox:woodstox-core-asl:jar:4.4.1 in the shaded jar.
[INFO] Including org.noggit:noggit:jar:0.6 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient:jar:4.3.3 in the shaded jar.
[INFO] Including commons-configuration:commons-configuration:jar:1.6 in the shaded jar.
[INFO] Including commons-fileupload:commons-fileupload:jar:1.2.1 in the shaded jar.
[INFO] Including commons-lang:commons-lang:jar:2.6 in the shaded jar.
[INFO] Including dom4j:dom4j:jar:1.6.1 in the shaded jar.
[INFO] Including joda-time:joda-time:jar:2.2 in the shaded jar.
[INFO] Including log4j:log4j:jar:1.2.17 in the shaded jar.
[INFO] Including org.ow2.asm:asm:jar:4.1 in the shaded jar.
[INFO] Including commons-beanutils:commons-beanutils:jar:1.8.3 in the shaded jar.
[INFO] Including commons-codec:commons-codec:jar:1.10 in the shaded jar.
[INFO] Including org.dataone:d1_cn_common:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including org.dataone:d1_common_java:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including javax.xml.bind:jaxb-api:jar:2.2.3 in the shaded jar.
[INFO] Including javax.xml.stream:stax-api:jar:1.0-2 in the shaded jar.
[INFO] Including org.apache.maven.plugins:maven-compiler-plugin:maven-plugin:2.3.1 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1 in the shaded jar.
[INFO] Including classworlds:classworlds:jar:1.1-alpha-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-api:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-artifact:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-core:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-settings:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-file:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-parameter-documenter:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-http-lightweight:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-http-shared:jar:1.0-beta-2 in the shaded jar.
[INFO] Including jtidy:jtidy:jar:4aug2000r7-dev in the shaded jar.
[INFO] Including org.apache.maven.reporting:maven-reporting-api:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.doxia:doxia-sink-api:jar:1.0-alpha-7 in the shaded jar.
[INFO] Including org.apache.maven:maven-profile:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-provider-api:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-repository-metadata:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-error-diagnostics:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-ssh-external:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-ssh-common:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-descriptor:jar:2.0.6 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4 in the shaded jar.
[INFO] Including org.apache.maven:maven-artifact-manager:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-monitor:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-ssh:jar:1.0-beta-2 in the shaded jar.
[INFO] Including com.jcraft:jsch:jar:0.1.27 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-utils:jar:2.0.5 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-compiler-api:jar:1.8 in the shaded jar.
[INFO] Including org.apache.maven:maven-toolchain:jar:1.0 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-compiler-manager:jar:1.8 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-compiler-javac:jar:1.8 in the shaded jar.
[INFO] Including org.apache.maven.plugins:maven-jar-plugin:maven-plugin:2.3.1 in the shaded jar.
[INFO] Including org.apache.maven:maven-project:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-registry:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-model:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-archiver:jar:2.4.1 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-interpolation:jar:1.13 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-archiver:jar:1.0 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-io:jar:1.0 in the shaded jar.
[INFO] Including org.apache.maven.plugins:maven-clean-plugin:maven-plugin:2.4.1 in the shaded jar.
[INFO] Including org.apache.commons:commons-collections4:jar:4.0 in the shaded jar.
[INFO] Including com.hazelcast:hazelcast-client:jar:2.4.1 in the shaded jar.
[INFO] Including org.apache.commons:commons-pool2:jar:2.4.2 in the shaded jar.
[INFO] Including org.dataone:d1_libclient_java:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including net.sf.jsignature.io-tools:easystream:jar:1.2.12 in the shaded jar.
[INFO] Including javax.mail:mail:jar:1.4.1 in the shaded jar.
[INFO] Including javax.activation:activation:jar:1.1 in the shaded jar.
[INFO] Including org.jibx:jibx-run:jar:1.2.4.5 in the shaded jar.
[INFO] Including xpp3:xpp3:jar:1.1.3.4.O in the shaded jar.
[INFO] Including org.bouncycastle:bcpkix-jdk15on:jar:1.52 in the shaded jar.
[INFO] Including org.bouncycastle:bcprov-jdk15on:jar:1.52 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient-cache:jar:4.3.6 in the shaded jar.
[INFO] Including com.googlecode.foresite-toolkit:foresite:jar:1.0-SNAPSHOT in the shaded jar.
[INFO] Including com.hp.hpl.jena:jena:jar:2.5.5 in the shaded jar.
[INFO] Including com.hp.hpl.jena:arq:jar:2.2 in the shaded jar.
[INFO] Including com.hp.hpl.jena:arq-extra:jar:2.2 in the shaded jar.
[INFO] Including com.hp.hpl.jena:jenatest:jar:2.5.5 in the shaded jar.
[INFO] Including com.hp.hpl.jena:iri:jar:0.5 in the shaded jar.
[INFO] Including com.hp.hpl.jena:concurrent-jena:jar:1.3.2 in the shaded jar.
[INFO] Including com.ibm.icu:icu4j:jar:3.4.4 in the shaded jar.
[INFO] Including com.hp.hpl.jena:json-jena:jar:1.0 in the shaded jar.
[INFO] Including stax:stax-api:jar:1.0 in the shaded jar.
[INFO] Including org.codehaus.woodstox:wstx-asl:jar:3.0.0 in the shaded jar.
[INFO] Including xerces:xmlParserAPIs:jar:2.0.2 in the shaded jar.
[INFO] Including rome:rome:jar:0.9 in the shaded jar.
[INFO] Including jdom:jdom:jar:1.0 in the shaded jar.
[INFO] Including xalan:xalan:jar:2.7.0 in the shaded jar.
[INFO] Including org.dataone:d1_cn_index_common:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including org.dataone:d1_cn_index_generator:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including com.hazelcast:hazelcast:jar:2.4.1 in the shaded jar.
[INFO] Including com.hazelcast:hazelcast-spring:jar:2.4.1 in the shaded jar.
[INFO] Including org.quartz-scheduler:quartz:jar:2.1.1 in the shaded jar.
[INFO] Including c3p0:c3p0:jar:0.9.1.1 in the shaded jar.
[INFO] Including org.springframework.data:spring-data-jpa:jar:1.4.5.RELEASE in the shaded jar.
[INFO] Including org.aspectj:aspectjrt:jar:1.7.2 in the shaded jar.
[INFO] Including org.slf4j:jcl-over-slf4j:jar:1.7.1 in the shaded jar.
[INFO] Including org.springframework.data:spring-data-commons:jar:1.6.5.RELEASE in the shaded jar.
[INFO] Including org.javassist:javassist:jar:3.18.2-GA in the shaded jar.
[INFO] Including org.hibernate:hibernate-entitymanager:jar:3.6.10.Final in the shaded jar.
[INFO] Including org.hibernate:hibernate-core:jar:3.6.10.Final in the shaded jar.
[INFO] Including antlr:antlr:jar:2.7.6 in the shaded jar.
[INFO] Including org.hibernate:hibernate-commons-annotations:jar:3.2.0.Final in the shaded jar.
[INFO] Including javax.transaction:jta:jar:1.1 in the shaded jar.
[INFO] Including org.hibernate.javax.persistence:hibernate-jpa-2.0-api:jar:1.0.1.Final in the shaded jar.
[INFO] Including cglib:cglib:jar:3.1 in the shaded jar.
[INFO] Including commons-dbcp:commons-dbcp:jar:1.2.2 in the shaded jar.
[INFO] Including commons-pool:commons-pool:jar:1.3 in the shaded jar.
[INFO] Including postgresql:postgresql:jar:8.4-702.jdbc4 in the shaded jar.
[INFO] Including org.springframework:spring-core:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-asm:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-beans:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-context:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-expression:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-aop:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including aopalliance:aopalliance:jar:1.0 in the shaded jar.
[INFO] Including org.springframework:spring-context-support:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-tx:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-orm:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-jdbc:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-web:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-test:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including commons-daemon:commons-daemon:jar:1.0.1 in the shaded jar.
[INFO] Including commons-io:commons-io:jar:2.0.1 in the shaded jar.
[INFO] Including org.apache.commons:commons-lang3:jar:3.5 in the shaded jar.
[INFO] Including commons-cli:commons-cli:jar:1.2 in the shaded jar.
[INFO] Including commons-logging:commons-logging:jar:1.1.1 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.5 in the shaded jar.
[INFO] Including org.slf4j:slf4j-log4j12:jar:1.7.5 in the shaded jar.
[INFO] Including log4j:apache-log4j-extras:jar:1.2.17 in the shaded jar.
[INFO] Including net.minidev:json-smart:jar:1.0.9 in the shaded jar.
[INFO] Including org.apache.jena:jena-tdb:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-arq:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-core:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-iri:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-base:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.commons:commons-csv:jar:1.5 in the shaded jar.
[INFO] Including com.github.andrewoma.dexx:collection:jar:0.7 in the shaded jar.
[INFO] Including org.apache.jena:jena-shaded-guava:jar:3.7.0 in the shaded jar.
[INFO] Including com.github.jsonld-java:jsonld-java:jar:0.11.1 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-core:jar:2.9.0 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-databind:jar:2.9.0 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-annotations:jar:2.9.0 in the shaded jar.
[INFO] Including org.apache.thrift:libthrift:jar:0.10.0 in the shaded jar.
[INFO] Including commons-collections:commons-collections:jar:3.2.1 in the shaded jar.
[INFO] Including ch.hsr:geohash:jar:1.0.10 in the shaded jar.
[INFO] Including net.sf.saxon:Saxon-HE:jar:9.9.1-3 in the shaded jar.
[WARNING] We have a duplicate org/w3c/dom/Attr.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/CDATASection.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/CharacterData.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Comment.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DOMException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DOMImplementation.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Document.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DocumentFragment.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DocumentType.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Element.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Entity.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/EntityReference.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/NamedNodeMap.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Node.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/NodeList.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Notation.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/ProcessingInstruction.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Text.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/AttributeList.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/Attributes.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/ContentHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/DTDHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/DocumentHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/EntityResolver.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/ErrorHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/HandlerBase.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/InputSource.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/Locator.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/Parser.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXNotRecognizedException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXNotSupportedException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXParseException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/XMLFilter.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/XMLReader.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate javax/xml/namespace/QName.class in /var/lib/jenkins/.m2/repository/xpp3/xpp3/1.1.3.4.O/xpp3-1.1.3.4.O.jar
[WARNING] We have a duplicate javax/xml/XMLConstants.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/namespace/NamespaceContext.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/namespace/QName.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/EventFilter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryConfigurationError.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder$1.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder$ClassLoaderFinder.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder$ClassLoaderFinderConcrete.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/Location.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/StreamFilter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLEventFactory.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLEventReader.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLEventWriter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLInputFactory.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLOutputFactory.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLReporter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLResolver.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamConstants.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamException.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamReader.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamWriter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Attribute.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Characters.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Comment.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/DTD.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EndDocument.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EndElement.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EntityDeclaration.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EntityReference.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Namespace.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/NotationDeclaration.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/ProcessingInstruction.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/StartDocument.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/StartElement.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/XMLEvent.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/EventReaderDelegate.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/StreamReaderDelegate.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/XMLEventAllocator.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/XMLEventConsumer.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/CommonConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/ReaderConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/ValidatorConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WriterConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WstxInputProperties$ParsingMode.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WstxInputProperties.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WstxOutputProperties.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/ErrorConsts.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/InputConfigFlags.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/OutputConfigFlags.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/ParsingErrorMsgs.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/XmlConsts.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ChoiceContentSpec$Validator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ChoiceContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ChoiceModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ConcatModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DFAState.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DFAValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDAttribute.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDCdataAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDEntitiesAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDEntityAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDEnumAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDId.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDIdAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDIdRefAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDIdRefsAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDNmTokenAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDNmTokensAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDNotationAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDSchemaFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDSubset.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDSubsetImpl.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDTypingNonValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDValidatorBase.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DefaultAttrValue$UndeclaredEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DefaultAttrValue.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/EmptyValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/FullDTDReader.class in
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/MinimalDTDReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/ModelNode.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/OptionalModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/SeqContentSpec$Validator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/SeqContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/StarModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/StructValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/TokenContentSpec$Validator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/TokenContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/dtd/TokenModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/ent/EntityDecl.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/ent/ExtEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar 
[WARNING] We have a duplicate com/ctc/wstx/ent/IntEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/ent/ParsedExtEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/ent/UnparsedExtEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/BaseStartElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/CompactStartElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/DefaultEventAllocator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/MergedNsContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/SimpleStartElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/WDTD.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/WEntityDeclaration.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/WEntityReference.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/WNotationDeclaration.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/evt/WstxEventReader.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxEOFException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxIOException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxLazyException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxOutputException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxParsingException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxUnexpectedCharException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/exc/WstxValidationException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/AsciiReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/BaseInputSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/BaseReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/BranchingReaderSource.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/BufferRecycler.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/CharArraySource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/CharsetNames.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/DefaultInputResolver.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/ISOLatinReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/InputBootstrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/InputSourceFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/MergedReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/MergedStream.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/ReaderBootstrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/ReaderSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/StreamBootstrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar 
[WARNING] We have a duplicate com/ctc/wstx/io/TextEscaper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/UTF32Reader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/UTF8Reader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/UTF8Writer.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/WstxInputData.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/WstxInputLocation.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/io/WstxInputSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/msv/AttributeProxy.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/msv/RelaxNGSchema.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/msv/RelaxNGSchemaFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/AttributeCollector.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/BasicStreamReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/CompactNsContext.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/ElemAttrs.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/ElemCallback.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/InputElementStack.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/InputProblemReporter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/NsDefaultProvider.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/ReaderCreator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/StreamReaderImpl.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/StreamScanner.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sr/ValidatingStreamReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/stax/WstxEventFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/stax/WstxInputFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/stax/WstxOutputFactory.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/AsciiXmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/BaseNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/BaseStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/BufferingXmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/EncodingXmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/ISOLatin1XmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/NonNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/RepairingNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/SimpleNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/SimpleOutputElement$AttrName.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/SimpleOutputElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriter.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriterWrapper$RawWrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriterWrapper$TextWrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriterWrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/ArgUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/BaseNsContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/BijectiveNsMap.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/DataUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/DefaultXmlSymbolTable.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/EmptyNamespaceContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/ExceptionUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/InternCache.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/SimpleCache.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/StringUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/StringVector.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/SymbolTable$Bucket.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/SymbolTable.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/TextAccumulator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/TextBuffer$BufferReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/TextBuffer.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/TextBuilder.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/URLUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/WordResolver$1.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/WordResolver$Builder.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/WordResolver.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/WordSet$Builder.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/WordSet.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate com/ctc/wstx/util/XmlChars.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/AttributeInfo.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/DTDInfo.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/LocationInfo.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/XMLEventReader2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/XMLInputFactory2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/XMLOutputFactory2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/XMLStreamLocation2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/XMLStreamProperties.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/XMLStreamReader2.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/XMLStreamWriter2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/evt/DTD2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/evt/XMLEvent2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/evt/XMLEventFactory2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/EscapingWriterFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2BlockResult.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2BlockSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2ByteArraySource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2CharArraySource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2FileResult.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2FileSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2ReferentialResult.class in 
/var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2ReferentialSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2Result.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2Source.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2StringSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/io/Stax2URLSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/AttributeContainer.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/DTDValidationSchema.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/Validatable.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/ValidationContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/ValidationProblemHandler.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/ValidatorPair.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate 
org/codehaus/stax2/validation/XMLValidationException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidationProblem.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidationSchema.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidationSchemaFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar [WARNING] We have a duplicate org/xml/sax/InputSource.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar [WARNING] We have a duplicate org/xml/sax/SAXException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar [WARNING] We have a duplicate org/xml/sax/EntityResolver.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar [WARNING] We have a duplicate org/xml/sax/ErrorHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar [WARNING] We have a duplicate org/xml/sax/SAXNotRecognizedException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar [WARNING] We have a duplicate org/xml/sax/SAXNotSupportedException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar [WARNING] We have a duplicate org/xml/sax/SAXParseException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar [WARNING] We have a duplicate org/xml/sax/Locator.class in 
/var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] Duplicate classes in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar:
[WARNING]   org/xml/sax: Parser, XMLReader, ContentHandler, DocumentHandler, DTDHandler, AttributeList, Attributes, HandlerBase, XMLFilter
[WARNING]   org/xml/sax/ext: DeclHandler, LexicalHandler
[WARNING]   org/xml/sax/helpers: DefaultHandler, NewInstance, ParserFactory, NamespaceSupport (+inner Context), AttributesImpl, XMLReaderAdapter (+inner AttributesAdapter), XMLFilterImpl, XMLReaderFactory, LocatorImpl, AttributeListImpl, ParserAdapter (+inner AttributeListAdapter)
[WARNING]   org/w3c/dom: Element, Node, Document, NodeList, DocumentType, CDATASection, Text, CharacterData, Entity, Attr, DOMException, NamedNodeMap, DOMImplementation, DocumentFragment, Comment, ProcessingInstruction, EntityReference, Notation
[WARNING]   org/w3c/dom/events: EventTarget, DocumentEvent, EventListener, Event, EventException, MutationEvent
[WARNING]   org/w3c/dom/traversal: DocumentTraversal, NodeFilter, NodeIterator, TreeWalker
[WARNING]   org/w3c/dom/ranges: DocumentRange, Range, RangeException
[WARNING]   org/w3c/dom/html: HTMLElement, HTMLOptGroupElement, HTMLDocument, HTMLFormElement, HTMLCollection, HTMLQuoteElement, HTMLHRElement, HTMLTableRowElement, HTMLScriptElement, HTMLAppletElement, HTMLMapElement, HTMLOptionElement, HTMLLegendElement, HTMLHeadingElement, HTMLIFrameElement, HTMLOListElement, HTMLButtonElement, HTMLDivElement, HTMLDOMImplementation, HTMLFieldSetElement, HTMLBRElement, HTMLPreElement, HTMLTableCaptionElement, HTMLMetaElement, HTMLModElement, HTMLBaseFontElement, HTMLTitleElement, HTMLDirectoryElement, HTMLStyleElement, HTMLImageElement, HTMLIsIndexElement, HTMLTableCellElement, HTMLTableColElement, HTMLLabelElement, HTMLFrameSetElement, HTMLSelectElement, HTMLParagraphElement, HTMLHtmlElement, HTMLLinkElement, HTMLFontElement, HTMLFrameElement, HTMLBodyElement, HTMLTableSectionElement, HTMLAnchorElement, HTMLBaseElement, HTMLObjectElement, HTMLInputElement, HTMLMenuElement, HTMLLIElement, HTMLParamElement, HTMLUListElement, HTMLDListElement, HTMLAreaElement, HTMLTextAreaElement, HTMLTableElement, HTMLHeadElement
[WARNING]   javax/xml/parsers: DocumentBuilderFactory, DocumentBuilder, ParserConfigurationException, FactoryConfigurationError, SAXParser, SAXParserFactory, FactoryFinder (+inner ConfigurationError)
[WARNING] Duplicate classes in /var/lib/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:
[WARNING]   org/apache/commons/logging/impl: NoOpLog, SimpleLog (+inner $1)
[WARNING]   org/apache/commons/logging: Log, LogConfigurationException, LogFactory
[WARNING] Duplicate classes in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar:
[WARNING]   org/apache/log4j: Appender, AppenderSkeleton, AsyncAppender (+inners DiscardSummary, Dispatcher), BasicConfigurator, Category, CategoryKey, ConsoleAppender (+inners SystemErrStream, SystemOutStream), DailyRollingFileAppender, DefaultCategoryFactory, DefaultThrowableRenderer, Dispatcher, EnhancedPatternLayout, EnhancedThrowableRenderer, FileAppender, HTMLLayout, Hierarchy, Layout, Level, LogMF, LogManager, LogSF, LogXF, Logger, MDC, NDC (+inner DiagnosticContext), NameValue, PatternLayout, Priority, PropertyConfigurator, PropertyWatchdog, ProvisionNode, RollingCalendar, RollingFileAppender, SimpleLayout, SortedKeyEnumeration, TTCCLayout, WriterAppender
[WARNING]   org/apache/log4j/pattern: BridgePatternConverter, BridgePatternParser, CachedDateFormat, ClassNamePatternConverter, DatePatternConverter (+inner DefaultZoneDateFormat), FileDatePatternConverter, FileLocationPatternConverter, FormattingInfo, FullLocationPatternConverter, IntegerPatternConverter, LevelPatternConverter, LineLocationPatternConverter, LineSeparatorPatternConverter, LiteralPatternConverter, LogEvent, LoggerPatternConverter, LoggingEventPatternConverter, MessagePatternConverter, MethodLocationPatternConverter, NDCPatternConverter, NameAbbreviator (+inners DropElementAbbreviator, MaxElementAbbreviator, NOPAbbreviator, PatternAbbreviator, PatternAbbreviatorFragment), NamePatternConverter, PatternConverter, PatternParser (+inner ReadOnlyMap), PropertiesPatternConverter, RelativeTimePatternConverter (+inner CachedTimestamp), SequenceNumberPatternConverter, ThreadPatternConverter, ThrowableInformationPatternConverter
[WARNING]   org/apache/log4j/spi: AppenderAttachable, Configurator, DefaultRepositorySelector, ErrorCode, ErrorHandler, Filter, HierarchyEventListener, LocationInfo, LoggerFactory, LoggerRepository, LoggingEvent, NOPLogger, NOPLoggerRepository, NullWriter, OptionHandler, RendererSupport, RepositorySelector, RootCategory, RootLogger, ThrowableInformation, ThrowableRenderer, ThrowableRendererSupport, TriggeringEventEvaluator, VectorWriter
[WARNING]   org/apache/log4j/varia: DenyAllFilter, ExternallyRolledFileAppender, FallbackErrorHandler, HUP, HUPNode, LevelMatchFilter, LevelRangeFilter, NullAppender, ReloadingPropertyConfigurator, Roller, StringMatchFilter
[WARNING]   org/apache/log4j/xml: DOMConfigurator (+inners $1-$5, ParseAction), Log4jEntityResolver
[WARNING] We have a duplicate org/apache/log4j/xml/SAXErrorHandler.class in
/var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar [WARNING] We have a duplicate org/apache/log4j/xml/UnrecognizedElementHandler.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar [WARNING] We have a duplicate org/apache/log4j/xml/XMLLayout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar [WARNING] We have a duplicate org/apache/log4j/xml/XMLWatchdog.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar [WARNING] We have a duplicate org/apache/commons/collections/ArrayStack.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/Buffer.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/BufferUnderflowException.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$1.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$CollectionView$CollectionViewIterator.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$CollectionView.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$EntrySet.class in 
/var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$KeySet.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$Values.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [WARNING] We have a duplicate org/apache/commons/collections/FastHashMap.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar [INFO] [INFO] --- maven-failsafe-plugin:2.8.1:integration-test (integration-test) @ d1_cn_index_processor --- [INFO] Failsafe report directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/failsafe-reports ------------------------------------------------------- T E S T S ------------------------------------------------------- Running org.dataone.cn.indexer.solrhttp.SolrClientUpdateMechanismIT Tests run: 3, Failures: 0, Errors: 0, Skipped: 3, Time elapsed: 0.028 sec Running org.dataone.cn.indexer.solrhttp.SolrUpdatePerformanceIT [ERROR] 2019-10-15 22:57:15,165 [main] (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/node.properties [ERROR] 2019-10-15 22:57:15,173 [main] (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. 
Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/index/d1client.properties
Oct 15, 2019 10:57:15 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Oct 15, 2019 10:57:15 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Oct 15, 2019 10:57:15 PM com.hazelcast.impl.AddressPicker
WARNING: Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Oct 15, 2019 10:57:15 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5720, using socket ServerSocket[addr=/0.0.0.0,localport=5720], bind any local is true
Oct 15, 2019 10:57:15 PM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5720
Oct 15, 2019 10:57:15 PM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Oct 15, 2019 10:57:15 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTING
Oct 15, 2019 10:57:15 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Members [1] {
    Member [127.0.0.1]:5720 this
}
Oct 15, 2019 10:57:15 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTED
[ INFO] 2019-10-15 22:57:16,893 [main] (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-10-15 22:57:17,193 [main] (org.dataone.cn.index.util.PerformanceLogger::19) Setting up PerformanceLogger: set to enabled? true
[ WARN] 2019-10-15 22:57:18,190 [main] (org.dataone.cn.index.processor.IndexTaskProcessor::162) IndexTaskProcessor initialized with stated number of threads = 5
TestSolrUpdate-1571180238226
[ WARN] 2019-10-15 22:57:18,554 [main] (org.dataone.client.auth.CertificateManager::203) FileNotFound: No certificate installed in the default location: /tmp/x509up_u107
[ WARN] 2019-10-15 22:57:18,597 [main] (org.dataone.client.utils.HttpConnectionMonitorService$SingletonHolder::46) Starting monitor thread
[ WARN] 2019-10-15 22:57:18,597 [Thread-5] (org.dataone.client.utils.HttpConnectionMonitorService:run:96) Starting monitoring...
[ WARN] 2019-10-15 22:57:18,598 [main] (org.dataone.client.utils.HttpConnectionMonitorService:addMonitor:65) registering ConnectionManager...
[ WARN] 2019-10-15 22:57:18,806 [main] (org.dataone.client.v2.itk.D1Client:bestAttemptRefreshNodeLocator:327) Could not refresh D1Client's NodeLocator, using previous one.
org.dataone.service.exceptions.ServiceFailure: 404: 404: parser for deserializing HTML not written yet.
Providing stripped-down html message body starting next line:
HTTP Status 404 – Not FoundType Status ReportMessage /cn/v2/nodeDescription The origin server did not find a current representation for the target resource or is not willing to disclose that one exists.Apache Tomcat/8.5.39 (Ubuntu)
    at org.dataone.service.util.ExceptionHandler.deserializeHtmlAndThrowException(ExceptionHandler.java:465)
    at org.dataone.service.util.ExceptionHandler.deserializeAndThrowException(ExceptionHandler.java:403)
    at org.dataone.service.util.ExceptionHandler.deserializeAndThrowException(ExceptionHandler.java:344)
    at org.dataone.service.util.ExceptionHandler.filterErrors(ExceptionHandler.java:138)
    at org.dataone.client.rest.HttpMultipartRestClient.doGetRequest(HttpMultipartRestClient.java:343)
    at org.dataone.client.rest.HttpMultipartRestClient.doGetRequest(HttpMultipartRestClient.java:328)
    at org.dataone.client.v2.impl.MultipartCNode.listNodes(MultipartCNode.java:445)
    at org.dataone.client.v2.impl.SettingsContextNodeLocator.getNodeListFromSettingsCN(SettingsContextNodeLocator.java:116)
    at org.dataone.client.v2.impl.SettingsContextNodeLocator.(SettingsContextNodeLocator.java:83)
    at org.dataone.client.v2.itk.D1Client.bestAttemptRefreshNodeLocator(D1Client.java:322)
    at org.dataone.client.v2.itk.D1Client.getCN(D1Client.java:268)
    at org.dataone.client.v2.formats.ObjectFormatCache.refreshCache(ObjectFormatCache.java:195)
    at org.dataone.client.v2.formats.ObjectFormatCache.(ObjectFormatCache.java:96)
    at org.dataone.client.v2.formats.ObjectFormatCache.(ObjectFormatCache.java:57)
    at org.dataone.client.v2.formats.ObjectFormatCache$ObjectFormatCacheSingleton.(ObjectFormatCache.java:110)
    at org.dataone.client.v2.formats.ObjectFormatCache.getInstance(ObjectFormatCache.java:117)
    at org.dataone.cn.indexer.convert.FormatIdToFormatTypeConverter.convert(FormatIdToFormatTypeConverter.java:18)
    at org.dataone.cn.indexer.parser.SolrField.processNodeValue(SolrField.java:216)
    at org.dataone.cn.indexer.parser.SolrField.processField(SolrField.java:190)
    at org.dataone.cn.indexer.parser.SolrField.getFields(SolrField.java:119)
    at org.dataone.cn.indexer.parser.BaseXPathDocumentSubprocessor.parseDocument(BaseXPathDocumentSubprocessor.java:157)
    at org.dataone.cn.indexer.parser.BaseXPathDocumentSubprocessor.parseDocument(BaseXPathDocumentSubprocessor.java:134)
    at org.dataone.cn.indexer.AbstractStubMergingSubprocessor.processDocument(AbstractStubMergingSubprocessor.java:77)
    at org.dataone.cn.indexer.SolrIndexServiceV2.processInsertTask(SolrIndexServiceV2.java:283)
    at org.dataone.cn.indexer.solrhttp.SolrUpdatePerformanceIT.update(SolrUpdatePerformanceIT.java:152)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
    at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
    at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
    at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:119)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:101)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.maven.surefire.booter.ProviderFactory$ClassLoaderProxy.invoke(ProviderFactory.java:103)
    at com.sun.proxy.$Proxy0.invoke(Unknown Source)
    at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:150)
    at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcess(SurefireStarter.java:91)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:69)
[ WARN] 2019-10-15 22:57:18,817 [main] (org.apache.http.client.protocol.ResponseProcessCookies:processCookies:121) Cookie rejected
[JSESSIONID="4A97760AC73E021CAB6911A00AAF5307", version:0, domain:cn-dev.test.dataone.org, path:/metacat, expiry:null] Illegal path attribute "/metacat". Path of origin: "/cn/v2/formats"
[ INFO] 2019-10-15 22:57:18,873 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,873 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@52a7928a
[ INFO] 2019-10-15 22:57:18,873 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238226
[ WARN] 2019-10-15 22:57:18,873 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-0 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238873
[ INFO] 2019-10-15 22:57:18,877 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,878 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1a1f79ce
[ INFO] 2019-10-15 22:57:18,878 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238873
[ WARN] 2019-10-15 22:57:18,878 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-1 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238878
[ INFO] 2019-10-15 22:57:18,882 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,882 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@601d6622
[ INFO] 2019-10-15 22:57:18,882 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238878
[ WARN] 2019-10-15 22:57:18,882 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-2 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238882
[ INFO] 2019-10-15 22:57:18,885 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,886 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@391d1e33
[ INFO] 2019-10-15 22:57:18,886 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238882
[ WARN] 2019-10-15 22:57:18,886 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-3 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238886
[ INFO] 2019-10-15 22:57:18,890 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,890 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3fde2209
[ INFO] 2019-10-15 22:57:18,890 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238886
[ WARN] 2019-10-15 22:57:18,890 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-4 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238890
[ INFO] 2019-10-15 22:57:18,894 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,894 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1e226bcd
[ INFO] 2019-10-15 22:57:18,894 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238890
[ WARN] 2019-10-15 22:57:18,894 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-5 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238894
[ INFO] 2019-10-15 22:57:18,897 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,897 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6f38f084
[ INFO] 2019-10-15 22:57:18,898 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238894
[ WARN] 2019-10-15 22:57:18,898 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-6 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238898
[ INFO] 2019-10-15 22:57:18,901 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,901 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5968800d
[ INFO] 2019-10-15 22:57:18,901 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238898
[ WARN] 2019-10-15 22:57:18,901 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-7 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238901
[ INFO] 2019-10-15 22:57:18,904 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,904 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6ddd1c51
[ INFO] 2019-10-15 22:57:18,904 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238901
[ WARN] 2019-10-15 22:57:18,904 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-8 is null, so skipping processing with content subprocessors
TestSolrUpdate-1571180238904
[ INFO] 2019-10-15 22:57:18,907 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-10-15 22:57:18,907 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94) main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6544899b
[ INFO] 2019-10-15 22:57:18,907 [main] (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96) main document id from parseDocuments: TestSolrUpdate-1571180238904
[ WARN] 2019-10-15 22:57:18,907 [main] (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1571180238225-9 is null, so skipping processing with content subprocessors
========================
iterations: 10
avg. time ms: 53
========================
===========================
Queries: 0
Total time: 0
Avg time: 0
===========================
Updates: 0
Total time: 0
Avg time: 0
commit within (ms): 250
Oct 15, 2019 10:57:20 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTTING_DOWN
Oct 15, 2019 10:57:21 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Destroying node initializer.
Oct 15, 2019 10:57:21 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Shutdown is completed in 740 ms.
Oct 15, 2019 10:57:21 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTDOWN
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 6.768 sec
Running org.dataone.cn.indexer.solrhttp.SolrJClientIT
Oct 15, 2019 10:57:21 PM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Oct 15, 2019 10:57:21 PM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Oct 15, 2019 10:57:21 PM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5720, using socket ServerSocket[addr=/0.0.0.0,localport=5720], bind any local is true
Oct 15, 2019 10:57:21 PM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5720
Oct 15, 2019 10:57:21 PM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Oct 15, 2019 10:57:21 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTING
Oct 15, 2019 10:57:21 PM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Members [1] {
    Member [127.0.0.1]:5720 this
}
Oct 15, 2019 10:57:21 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTED
Oct 15, 2019 10:57:23 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTTING_DOWN
Oct 15, 2019 10:57:24 PM com.hazelcast.initializer
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Destroying node initializer.
Oct 15, 2019 10:57:24 PM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Shutdown is completed in 725 ms.
Oct 15, 2019 10:57:24 PM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTDOWN
Tests run: 6, Failures: 0, Errors: 0, Skipped: 5, Time elapsed: 2.768 sec

Results :

Tests run: 12, Failures: 0, Errors: 0, Skipped: 9

[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
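[Editor's note] The "File encoding has not been set" warning above means plugins fall back to the platform default, so the build can produce different bytes on differently configured machines. The standard remedy is to declare the encoding once in the pom; a minimal sketch (the property names below are the usual Maven conventions, not taken from this project's pom):

```xml
<!-- Sketch: pin the build encoding so resource filtering and compilation
     stop depending on the platform default locale. -->
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>
```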
[JENKINS] Recording test results
[INFO]
[INFO] --- maven-failsafe-plugin:2.8.1:verify (verify) @ d1_cn_index_processor ---
[INFO] Failsafe report directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/failsafe-reports
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[JENKINS] Recording test results
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ d1_cn_index_processor ---
[INFO] Installing /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/d1_cn_index_processor-2.4.0-SNAPSHOT.jar to /var/lib/jenkins/.m2/repository/org/dataone/d1_cn_index_processor/2.4.0-SNAPSHOT/d1_cn_index_processor-2.4.0-SNAPSHOT.jar
[INFO] Installing /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/pom.xml to /var/lib/jenkins/.m2/repository/org/dataone/d1_cn_index_processor/2.4.0-SNAPSHOT/d1_cn_index_processor-2.4.0-SNAPSHOT.pom
Notifying upstream projects of job completion
Join notifier requires a CauseAction
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:24 min
[INFO] Finished at: 2019-10-15T22:57:25Z
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data
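[Editor's note] The long run of "We have a duplicate ... .class" warnings earlier in this log occurs because two artifacts on the classpath (log4j:apache-log4j-extras 1.2.17 alongside the core log4j jar, and overlapping copies of commons-collections classes) ship the same compiled classes, which the packaging step then merges. One common remedy is to exclude the redundant copy on whichever dependency pulls it in transitively. The log does not show which dependency that is, so the outer `<dependency>` coordinates below are placeholders; only the `<exclusion>` coordinates come from the warnings themselves:

```xml
<!-- Hypothetical sketch: exclude the overlapping log4j companion jar.
     "some.group:some-artifact" is a placeholder for the dependency that
     actually drags apache-log4j-extras in (not identifiable from this log). -->
<dependency>
  <groupId>some.group</groupId>
  <artifactId>some-artifact</artifactId>
  <exclusions>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>apache-log4j-extras</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Run `mvn dependency:tree` first to see which dependency introduces the duplicate artifact before adding the exclusion.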