Success
Console Output

Established TCP socket on 36407
<===[JENKINS REMOTING CAPACITY]===>
channel started
Executing Maven:  -B -f /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/pom.xml clean install
[INFO] Scanning for projects...
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for org.dataone:d1_cn_index_processor:jar:2.4.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.version' for com.mycila.maven-license-plugin:maven-license-plugin is missing. @ line 343, column 15
[WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 335, column 15
[WARNING] 'build.plugins.plugin.version' for org.codehaus.mojo:buildnumber-maven-plugin is missing. @ line 350, column 21
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING] 
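The three warnings above mean the pom.xml declares those plugins (at the cited lines 335, 343, and 350) without a `<version>` element, so Maven resolves whatever version it finds. A sketch of the fix: pin each plugin explicitly. The compiler and buildnumber versions below are the ones this log later resolves (3.1 and 1.4); the maven-license-plugin version is an assumption and should be set to whatever this build actually resolved.

```xml
<!-- Pin plugin versions to silence the 'build.plugins.plugin.version ... is missing'
     warnings and make the build reproducible. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.1</version> <!-- version resolved later in this log -->
</plugin>
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>buildnumber-maven-plugin</artifactId>
  <version>1.4</version> <!-- version resolved later in this log -->
</plugin>
<plugin>
  <groupId>com.mycila.maven-license-plugin</groupId>
  <artifactId>maven-license-plugin</artifactId>
  <version>1.10.b1</version> <!-- assumed: not shown in this log; use the resolved version -->
</plugin>
```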
[INFO] 
[INFO] -----------------< org.dataone:d1_cn_index_processor >------------------
[INFO] Building d1_cn_index_processor 2.4.0-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ d1_cn_index_processor ---
[INFO] 
[INFO] --- buildnumber-maven-plugin:1.4:create (default) @ d1_cn_index_processor ---
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace
[INFO] Storing buildNumber: null at timestamp: 1573532742426
[INFO] Executing: /bin/sh -c cd '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace' && 'svn' '--non-interactive' 'info'
[INFO] Working directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace
[INFO] Storing buildScmBranch: UNKNOWN_BRANCH
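`buildNumber: null` and `buildScmBranch: UNKNOWN_BRANCH` indicate that `svn info` returned no usable revision for this workspace (for example, if Jenkins checked the code out by some means other than an svn working copy). One hedged mitigation, assuming the goal is merely a non-null build number, is the plugin's `revisionOnScmFailure` fallback:

```xml
<!-- Sketch: give buildnumber-maven-plugin a fallback value so buildNumber is
     never stored as null when `svn info` cannot determine a revision. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>buildnumber-maven-plugin</artifactId>
  <version>1.4</version>
  <configuration>
    <revisionOnScmFailure>unknown</revisionOnScmFailure>
  </configuration>
</plugin>
```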
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ d1_cn_index_processor ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 70 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ d1_cn_index_processor ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 95 source files to /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/classes
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/resourcemap/ResourceMapDataSource.java: Some input files use or override a deprecated API.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/resourcemap/ResourceMapDataSource.java: Recompile with -Xlint:deprecation for details.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/XMLNamespaceConfig.java: Some input files use unchecked or unsafe operations.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/main/java/org/dataone/cn/indexer/XMLNamespaceConfig.java: Recompile with -Xlint:unchecked for details.
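To see the details these warnings point at ("Recompile with -Xlint:... for details"), the lint flags can be passed through the maven-compiler-plugin configuration. This is a sketch against version 3.1, the version this build uses; `compilerArgs` is available from that release onward:

```xml
<!-- Sketch: surface the deprecation/unchecked details at compile time. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.1</version>
  <configuration>
    <compilerArgs>
      <arg>-Xlint:deprecation</arg>
      <arg>-Xlint:unchecked</arg>
    </compilerArgs>
  </configuration>
</plugin>
```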
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ d1_cn_index_processor ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 317 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ d1_cn_index_processor ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 51 source files to /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/test-classes
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/InvalidXmlCharTest.java: Some input files use or override a deprecated API.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/InvalidXmlCharTest.java: Recompile with -Xlint:deprecation for details.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/DataONESolrJettyTestBase.java: Some input files use unchecked or unsafe operations.
[WARNING] /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/src/test/java/org/dataone/cn/index/DataONESolrJettyTestBase.java: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-surefire-plugin:2.20:test (default-test) @ d1_cn_index_processor ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.dataone.cn.index.SolrIndexDeleteTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrIndexDeleteTest_FFBF373BD2808889-001/init-core-data-001
ERROR IN SolrLogFormatter! original message:Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:35)
	at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:30)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:30)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.index.SolrIndexDeleteTest.init(SolrIndexDeleteTest.java:941)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
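The trace suggests the failure mode: `SolrLogFormatter$Method.hashCode` is reached while formatting a record logged by Hazelcast, and it throws an NPE, apparently because it assumes every `LogRecord` carries non-null source class/method names. The NPE is in the *formatter*, so the underlying Hazelcast messages are harmless startup logging. A minimal sketch of a null-safe `java.util.logging` formatter (`NullSafeFormatter` is a hypothetical name, not part of this project) that tolerates such records:

```java
import java.util.logging.Formatter;
import java.util.logging.Level;
import java.util.logging.LogRecord;

// Defensive j.u.l formatter: never dereferences the record's source class or
// method name without a null check, falling back to the logger name instead.
public class NullSafeFormatter extends Formatter {
    @Override
    public String format(LogRecord record) {
        String cls = record.getSourceClassName();   // may be null
        String mth = record.getSourceMethodName();  // may be null
        String source = (cls == null ? String.valueOf(record.getLoggerName()) : cls)
                + (mth == null ? "" : "." + mth);
        return record.getLevel() + " " + source + " - "
                + formatMessage(record) + System.lineSeparator();
    }

    public static void main(String[] args) {
        // Reproduce the problematic shape: explicit null source class/method,
        // as a third-party logging bridge might produce.
        LogRecord r = new LogRecord(Level.INFO, "Configuring Hazelcast");
        r.setSourceClassName(null);
        r.setSourceMethodName(null);
        r.setLoggerName("com.hazelcast.config");
        System.out.print(new NullSafeFormatter().format(r));
    }
}
```

In this build's context the practical workaround would be routing the test's `ConsoleHandler` to such a formatter (or a stock one like `SimpleFormatter`) instead of `SolrLogFormatter`.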
ERROR IN SolrLogFormatter! original message:Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
ERROR IN SolrLogFormatter! original message:Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
ERROR IN SolrLogFormatter! original message:Picked Address[127.0.0.1]:5720, using socket ServerSocket[addr=/0:0:0:0:0:0:0:0,localport=5720], bind any local is true
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5720
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTING
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] 


Members [1] {
	Member [127.0.0.1]:5720 this
}

	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.impl.AbstractJoiner$1.process(AbstractJoiner.java:118)
	at com.hazelcast.cluster.ClusterService$1.process(ClusterService.java:127)
	at com.hazelcast.cluster.ClusterService.processProcessable(ClusterService.java:190)
	at com.hazelcast.cluster.ClusterService.dequeueProcessables(ClusterService.java:256)
	at com.hazelcast.cluster.ClusterService.run(ClusterService.java:201)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTED
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.impl.LifecycleServiceImpl.fireLifecycleEvent(LifecycleServiceImpl.java:63)
	at com.hazelcast.impl.LifecycleServiceImpl.fireLifecycleEvent(LifecycleServiceImpl.java:59)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:175)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:119)
	at com.hazelcast.impl.FactoryImpl.newHazelcastInstanceProxy(FactoryImpl.java:104)
	at com.hazelcast.core.Hazelcast.newHazelcastInstance(Hazelcast.java:507)
	at org.dataone.cn.index.HazelcastClientFactoryTest.startHazelcast(HazelcastClientFactoryTest.java:38)
	at org.dataone.cn.index.HazelcastClientFactoryTest.setUp(HazelcastClientFactoryTest.java:54)
	at org.dataone.cn.index.SolrIndexDeleteTest.init(SolrIndexDeleteTest.java:941)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:776)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
[ INFO] 2019-11-12 04:25:48,225 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ERROR] 2019-11-12 04:25:48,666 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/node.properties
[ERROR] 2019-11-12 04:25:48,672 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/index/d1client.properties
[ WARN] 2019-11-12 04:25:48,682 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.index.util.PerformanceLogger:<init>:19) Setting up PerformanceLogger: set to enabled? true
[ WARN] 2019-11-12 04:25:49,529 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:25:49,855 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:25:50,269 [coreLoadExecutor-5-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:25:50,272 [coreLoadExecutor-5-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:25:50,362 [coreLoadExecutor-5-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:25:50,488 [coreLoadExecutor-5-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ WARN] 2019-11-12 04:25:50,643 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:25:50,643 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:25:50,644 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=60}
[ WARN] 2019-11-12 04:25:50,644 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:25:50,644 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ WARN] 2019-11-12 04:25:51,133 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.client.auth.CertificateManager:<init>:203) FileNotFound: No certificate installed in the default location: /tmp/x509up_u107
[ WARN] 2019-11-12 04:25:51,175 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.client.utils.HttpConnectionMonitorService$SingletonHolder:<clinit>:46) Starting monitor thread
[ WARN] 2019-11-12 04:25:51,175 [Thread-15]  (org.dataone.client.utils.HttpConnectionMonitorService:run:96) Starting monitoring...
[ WARN] 2019-11-12 04:25:51,175 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.client.utils.HttpConnectionMonitorService:addMonitor:65) registering ConnectionManager...
[ WARN] 2019-11-12 04:25:51,447 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.apache.http.client.protocol.ResponseProcessCookies:processCookies:121) Cookie rejected [JSESSIONID="55D5D89FDDB2F0373D51242975C98C0E", version:0, domain:cn-dev.test.dataone.org, path:/metacat, expiry:null] Illegal path attribute "/metacat". Path of origin: "/cn/v2/formats"
[ INFO] 2019-11-12 04:25:51,500 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:25:51,500 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@4538bcc7
[ INFO] 2019-11-12 04:25:51,500 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:25:51,544 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:25:51,545 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@24f56a38
[ INFO] 2019-11-12 04:25:51,547 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:25:51,549 [TEST-SolrIndexDeleteTest.testHttpServiceSolrDelete-seed#[FFBF373BD2808889]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[WARNING] Tests run: 10, Failures: 0, Errors: 0, Skipped: 9, Time elapsed: 7.805 s - in org.dataone.cn.index.SolrIndexDeleteTest
[INFO] Running org.dataone.cn.index.IndexTaskProcessingIntegrationTest
[ INFO] 2019-11-12 04:25:54,035 [main]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:25:54,163 [main]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.283 s - in org.dataone.cn.index.IndexTaskProcessingIntegrationTest
[INFO] Running org.dataone.cn.index.SolrRangeQueryTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrRangeQueryTest_A7BFB071DC31BC23-001/init-core-data-001
[ INFO] 2019-11-12 04:25:54,508 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:25:54,626 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:25:54,657 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:25:54,869 [coreLoadExecutor-15-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:25:54,872 [coreLoadExecutor-15-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:25:54,893 [coreLoadExecutor-15-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:25:54,904 [coreLoadExecutor-15-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
***** before delete *********************************
[ WARN] 2019-11-12 04:25:54,915 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:25:54,916 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:25:54,916 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-11-12 04:25:54,916 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:25:54,917 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
***** between **********************************
[ INFO] 2019-11-12 04:25:54,923 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:25:54,924 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3bd8c634
[ INFO] 2019-11-12 04:25:54,924 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:25:54,948 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:25:54,948 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@35fa44b0
[ INFO] 2019-11-12 04:25:54,949 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:25:54,949 [TEST-SolrRangeQueryTest.testTwoFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
***** after load *********************************
***** two field range query 
[ INFO] 2019-11-12 04:25:57,245 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:25:57,349 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:25:57,364 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
***** before delete *********************************
[ WARN] 2019-11-12 04:25:57,389 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:25:57,389 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:25:57,389 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=16}
[ WARN] 2019-11-12 04:25:57,390 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:25:57,390 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
***** between **********************************
[ INFO] 2019-11-12 04:25:57,394 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:25:57,394 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@4e99f6ce
[ INFO] 2019-11-12 04:25:57,395 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:25:57,411 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:25:57,412 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@798c3521
[ INFO] 2019-11-12 04:25:57,412 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:25:57,413 [TEST-SolrRangeQueryTest.testSimpleRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
***** after load *********************************
***** simpleRangeQuery 
[ INFO] 2019-11-12 04:25:59,859 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:00,026 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:00,041 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
***** before delete *********************************
[ WARN] 2019-11-12 04:26:00,054 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:00,055 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:00,055 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:00,055 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:00,056 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
***** between **********************************
[ INFO] 2019-11-12 04:26:00,060 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:00,061 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@40685fe5
[ INFO] 2019-11-12 04:26:00,061 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:00,073 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:00,074 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2cc89080
[ INFO] 2019-11-12 04:26:00,074 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:00,074 [TEST-SolrRangeQueryTest.testFourFieldRangeQuery-seed#[A7BFB071DC31BC23]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
***** after load *********************************
***** four field range query
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.936 s - in org.dataone.cn.index.SolrRangeQueryTest
[INFO] Running org.dataone.cn.index.SolrFieldDublinCoreOAITest
[ INFO] 2019-11-12 04:26:02,589 [main]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:02,696 [main]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.611 s - in org.dataone.cn.index.SolrFieldDublinCoreOAITest
[INFO] Running org.dataone.cn.index.SolrIndexFieldTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrIndexFieldTest_C0AFD2B3CC26AD0C-001/init-core-data-001
[ INFO] 2019-11-12 04:26:02,966 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:03,051 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:03,088 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:26:03,311 [coreLoadExecutor-25-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:26:03,312 [coreLoadExecutor-25-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:26:03,336 [coreLoadExecutor-25-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:26:03,365 [coreLoadExecutor-25-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ INFO] 2019-11-12 04:26:03,374 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:03,376 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@e652f9
[ INFO] 2019-11-12 04:26:03,378 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: 68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml
[ INFO] 2019-11-12 04:26:03,402 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:03,403 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5ae1ecfd
[ INFO] 2019-11-12 04:26:03,406 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:03,408 [TEST-SolrIndexFieldTest.testComplexSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Comparing value for field abstract
Doc Value:  The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.
Solr Value: The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.

Comparing value for field beginDate
Doc Value:  Tue Jul 01 00:00:00 GMT 2014
Solr Value: Tue Jul 01 00:00:00 GMT 2014

Comparing value for field contactOrganization
Doc Value:  [Earth Data Analysis Center]
Solr Value: [Earth Data Analysis Center]

Comparing value for field eastBoundCoord
Doc Value:  -106.05
Solr Value: -106.05

Comparing value for field westBoundCoord
Doc Value:  -106.05
Solr Value: -106.05

Comparing value for field northBoundCoord
Doc Value:  36.083332
Solr Value: 36.083332

Comparing value for field southBoundCoord
Doc Value:  36.083332
Solr Value: 36.083332

Comparing value for field endDate
Doc Value:  Fri Aug 01 00:00:00 GMT 2014
Solr Value: Fri Aug 01 00:00:00 GMT 2014

Comparing value for field keywords
Doc Value:  [climatologyMeteorologyAtmosphere, precipitation, air temperature, relative humidity, solar radiation, barometric pressure, snow water content, snow depth, soil moisture, soil temperature, EPSCoR Tract 3, EPSG:4269, EPSG:7019, NM, Rio Arriba, Alcalde]
Solr Value: [climatologyMeteorologyAtmosphere, precipitation, air temperature, relative humidity, solar radiation, barometric pressure, snow water content, snow depth, soil moisture, soil temperature, EPSCoR Tract 3, EPSG:4269, EPSG:7019, NM, Rio Arriba, Alcalde]

Comparing value for field geoform
Doc Value:  vector digital data
Solr Value: vector digital data

Comparing value for field origin
Doc Value:  [Water and Climate Center, National Resources Conservation Service]
Solr Value: [Water and Climate Center, National Resources Conservation Service]

Comparing value for field placeKey
Doc Value:  [EPSG:4269, EPSG:7019, NM, Rio Arriba, Alcalde]
Solr Value: [EPSG:4269, EPSG:7019, NM, Rio Arriba, Alcalde]

Comparing value for field pubDate
Doc Value:  Mon Aug 04 00:00:00 GMT 2014
Solr Value: Mon Aug 04 00:00:00 GMT 2014

Comparing value for field purpose
Doc Value:  The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.
Solr Value: The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.

Comparing value for field title
Doc Value:  SCAN Alcalde, NM (July, 2014)
Solr Value: SCAN Alcalde, NM (July, 2014)

Comparing value for field webUrl
Doc Value:  [http://nmepscor.org/dataportal, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wms?SERVICE=wms&REQUEST=GetCapabilities&VERSION=1.1.1, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wfs?SERVICE=wfs&REQUEST=GetCapabilities&VERSION=1.0.0, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.shp, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.original.zip, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.gml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.kml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.geojson, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.json, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.csv, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.xls, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.xml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.html, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.xml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.html, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WMS.xml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WFS.xml, 
http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19110.xml]
Solr Value: [http://nmepscor.org/dataportal, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wms?SERVICE=wms&REQUEST=GetCapabilities&VERSION=1.1.1, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wfs?SERVICE=wfs&REQUEST=GetCapabilities&VERSION=1.0.0, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.shp, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.original.zip, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.gml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.kml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.geojson, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.json, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.csv, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.xls, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.xml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.html, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.xml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.html, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WMS.xml, http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WFS.xml, 
http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19110.xml]

Comparing value for field fileID
Doc Value:  https://cn.dataone.org/cn/v2/resolve/68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml
Solr Value: https://cn.dataone.org/cn/v2/resolve/68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml

Comparing value for field text
Doc Value:  Water and Climate Center, National Resources Conservation Service  20140804  SCAN Alcalde, NM (July, 2014)  vector digital data  http://nmepscor.org/dataportal  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wms?SERVICE=wms&REQUEST=GetCapabilities&VERSION=1.1.1  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wfs?SERVICE=wfs&REQUEST=GetCapabilities&VERSION=1.0.0  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.shp  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.original.zip  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.gml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.kml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.geojson  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.json  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.csv  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.xls  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.xml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.html  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.xml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.html  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WMS.xml  
http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WFS.xml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19110.xml     The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.  The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.  Funding for this SCAN station was provided by EPSCoR. These data are 
                made available through New Mexico's Experimental Program to
                Stimulate Competitive Research (NM EPSCoR) data portal. The portal provides access
                to data generated by and of use to the EPSCoR community including researchers,
                educators, and policy leaders, as well as the general public.
                http://www.wcc.nrcs.usda.gov/scan/      20140701  010000  20140801    Ground Condition    Complete  Unknown     -106.05  -106.05  36.0833333333  36.0833333333      ISO 19115 Topic Categories  climatologyMeteorologyAtmosphere    http://www.wcc.nrcs.usda.gov/scan/  precipitation  air temperature  relative humidity  solar radiation  barometric pressure  snow water content  snow depth  soil moisture  soil temperature    EPSCoR Project  EPSCoR Tract 3    Spatial Reference System Identifiers  EPSG:4269  EPSG:7019    http://www.wcc.nrcs.usda.gov/scan/  NM  Rio Arriba  Alcalde    None  None     Tony Tolsdorf  National Water and Climate Center   Monitoring Branch Leader   Mailing address  1201 NE Lloyd Blvd., Suite 802  Portland  OR  97232-1274   503-414-3006  tony.tolsdorf@por.usda.gov    Microsoft Windows XP Version 5.1 (Build 2600) Service Pack 3; ESRI ArcCatalog
            9.3.1.3000    Not provided  Not provided    Create the shapefiles using the provided lat/lon from the NRCS, one per
                    station using an Arcpy script. Download the CSV data for each station, each
                    year, each month from the station's Reporting Since date.  20140804     Earth Data Analysis Center   Clearinghouse Manager   mailing address  MSC01 1110  1 University of New Mexico  Albuquerque  NM  87131-0001  USA   505-277-3622 ext. 230  505-277-3614  clearinghouse@edac.unm.edu  8am to 5pm MT, M-F        Rio Arriba County, NM  Vector    Point  744        0.00000001  0.00000001  Decimal degrees    North American Datum of 1983  Geodetic Reference System 80  6378137.000000  298.257222        SCAN  Soil, Climate, Analysis, Network (SCAN)  http://www.wcc.nrcs.usda.gov/scan/    FID  Internal feature number.  ESRI   Sequential unique whole numbers that are automatically generated.     Shape  Feature geometry.  ESRI   Coordinates defining the features.     site_id  NRCS Internal Site Number  NRCS   NRCS Internal Site Number     date  Date  NRCS   Date     time  Time (Pacific Standard Time)  NRCS   Time (Pacific Standard Time)     preci_1  Precipitation Accumulated  NRCS    Unknown  Unknown  inches      tobsi_1  Air Temperature Observed  NRCS    Unknown  Unknown  degrees Celsius      tmaxh_1  Air Temperature Maximum  NRCS    Unknown  Unknown  degrees Celsius      tminh_1  Air Temperature Minimum  NRCS    Unknown  Unknown  degrees Celsius      tavgh_1  Air Temperature Average  NRCS    Unknown  Unknown  degrees Celsius      prcph_1  Precipitation Increment  NRCS    Unknown  Unknown  inches      smsi_1_2l  Soil Moisture Percent; Sensor Height -2in  NRCS    Unknown  Unknown  percent      smsi_1_4l  Soil Moisture Percent; Sensor Height -4in  NRCS    Unknown  Unknown  percent      smsi_1_8l  Soil Moisture Percent; Sensor Height -8in  NRCS    Unknown  Unknown  percent      smsi_1_20l  Soil Moisture Percent; Sensor Height -20in  NRCS    Unknown  Unknown  percent      smsi_1_40l  Soil Moisture Percent; Sensor Height -40in  NRCS    Unknown  Unknown  percent      stoi_1_2  Soil Temperature Observed; Sensor Height -2in  NRCS    Unknown  Unknown  degrees Celsius    
  stoi_1_4  Soil Temperature Observed; Sensor Height -4in  NRCS    Unknown  Unknown  degrees Celsius      stoi_1_8  Soil Temperature Observed; Sensor Height -8in  NRCS    Unknown  Unknown  degrees Celsius      stoi_1_20  Soil Temperature Observed; Sensor Height -20in  NRCS    Unknown  Unknown  degrees Celsius      stoi_1_40  Soil Temperature Observed; Sensor Height -40in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_2  Soil Temperature Observed; Sensor Height -2in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_4  Soil Temperature Observed; Sensor Height -4in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_8  Soil Temperature Observed; Sensor Height -8in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_20  Soil Temperature Observed; Sensor Height -20in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_40  Soil Temperature Observed; Sensor Height -40in  NRCS    Unknown  Unknown  degrees Celsius      sali_1_2  Salinity; Sensor Height -2in  NRCS    Unknown  Unknown  gram      sali_1_4  Salinity; Sensor Height -4in  NRCS    Unknown  Unknown  gram      sali_1_8  Salinity; Sensor Height -8in  NRCS    Unknown  Unknown  gram      sali_1_20  Salinity; Sensor Height -20in  NRCS    Unknown  Unknown  gram      sali_1_40  Salinity; Sensor Height -40in  NRCS    Unknown  Unknown  gram      rdci_1_2  Real Dielectric Constant; Sensor Height -2in  NRCS    Unknown  Unknown  unitless      rdci_1_4  Real Dielectric Constant; Sensor Height -4in  NRCS    Unknown  Unknown  unitless      rdci_1_8  Real Dielectric Constant; Sensor Height -8in  NRCS    Unknown  Unknown  unitless      rdci_1_20  Real Dielectric Constant; Sensor Height -20in  NRCS    Unknown  Unknown  unitless      rdci_1_40  Real Dielectric Constant; Sensor Height -40in  NRCS    Unknown  Unknown  unitless      batti_1  Battery Voltage  NRCS    Unknown  Unknown  volt      batti_2  Battery Voltage  NRCS    Unknown  Unknown  volt      wdirvh_1  Wind Direction Average  NRCS    Unknown  Unknown  
degrees      wspdxh_1mx  Wind Speed Maximum  NRCS    Unknown  Unknown  mph      wspdvh1  Wind Speed Average  NRCS    Unknown  Unknown  mph      rhumi_1  Relative Humidity; Vaisala  NRCS    Unknown  Unknown  percent      presi_1  Barometric Pressure; Sampled 10 minutes  NRCS    Unknown  Unknown  inches      sradvh_1  Solar Radiation Average  NRCS    Unknown  Unknown  watt      dptph_1  Dew Point Temperature  NRCS    Unknown  Unknown  degrees Celsius      pvpvh_1  Vapor Pressure  NRCS    Unknown  Unknown  kPa      rhumnh_1  Relative Humidity Minimum  NRCS    Unknown  Unknown  percent      rhumxh_1  Relative Humidity Maximum  NRCS    Unknown  Unknown  percent           Earth Data Analysis Center   Clearinghouse Manager   mailing and physical address  MSC01 1110  1 University of New Mexico  Albuquerque  NM  87131-0001  USA   505-277-3622 ext. 230  505-277-3614  clearinghouse@edac.unm.edu  0800 - 1700 MT, M-F -7 hours GMT    Downloadable Data  The material on this site is made available as a public service. Maps and data are to be used for reference purposes only and the Earth Data Analysis Center (EDAC), New Mexico EPSCoR (NM EPSCoR), Energize New Mexico and The University of New Mexico are not responsible for any inaccuracies herein contained. No responsibility is assumed for damages or other liabilities due to the accuracy, availability, use or misuse of the information herein provided. Unless otherwise indicated in the documentation (metadata) for individual data sets, information on this site is public domain and may be copied without permission; citation of the source is appreciated.     ZIP  1       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.original.zip    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  
Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      ESRI Shapefile (shp)  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.shp    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      GML  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.gml    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      KML  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.kml    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      GeoJSON  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.geojson    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      JSON  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.json    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. 
The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      Comma Separated Values (csv)  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.csv    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      MS Excel format (xls)  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.xls    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu   Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu  Adequate computer capability is the only technical prerequisite for viewing data in digital form.    20150423     Earth Data Analysis Center   Clearinghouse Manager   mailing and physical address  MSC01 1110  1 University of New Mexico  Albuquerque  NM  87131-0001  USA   505-277-3622 ext. 230  505-277-3614  clearinghouse@edac.unm.edu  0800 - 1700 MT, M-F -7 hours GMT    FGDC Content Standards for Digital Geospatial Metadata  FGDC-STD-001-1998  local time 68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml
Solr Value: Water and Climate Center, National Resources Conservation Service  20140804  SCAN Alcalde, NM (July, 2014)  vector digital data  http://nmepscor.org/dataportal  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wms?SERVICE=wms&REQUEST=GetCapabilities&VERSION=1.1.1  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/services/ogc/wfs?SERVICE=wfs&REQUEST=GetCapabilities&VERSION=1.0.0  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.shp  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.original.zip  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.gml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.kml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.geojson  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.json  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.csv  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.xls  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.xml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/FGDC-STD-001-1998.html  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.xml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19115:2003.html  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WMS.xml  
http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19119:WFS.xml  http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/metadata/ISO-19110.xml     The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.  The Soil Climate Analysis Network (SCAN) is a comprehensive, nationwide soil
                moisture and climate information system designed to provide data to support natural
                resource assessments and conservation activities. Administered by the United States
                Department of Agriculture Natural Resources Conservation Service (NRCS) through the
                National Water and Climate Center (NWCC), in cooperation with the NRCS National Soil
                Survey Center, the system focuses on agricultural areas of the U.S. monitoring soil
                temperature and soil moisture content at several depths, soil water level, air
                temperature, relative humidity, solar radiation, wind, precipitation, barometric
                pressure, and more.  Funding for this SCAN station was provided by EPSCoR. These data are 
                made available through New Mexico's Experimental Program to
                Stimulate Competitive Research (NM EPSCoR) data portal. The portal provides access
                to data generated by and of use to the EPSCoR community including researchers,
                educators, and policy leaders, as well as the general public.
                http://www.wcc.nrcs.usda.gov/scan/      20140701  010000  20140801    Ground Condition    Complete  Unknown     -106.05  -106.05  36.0833333333  36.0833333333      ISO 19115 Topic Categories  climatologyMeteorologyAtmosphere    http://www.wcc.nrcs.usda.gov/scan/  precipitation  air temperature  relative humidity  solar radiation  barometric pressure  snow water content  snow depth  soil moisture  soil temperature    EPSCoR Project  EPSCoR Tract 3    Spatial Reference System Identifiers  EPSG:4269  EPSG:7019    http://www.wcc.nrcs.usda.gov/scan/  NM  Rio Arriba  Alcalde    None  None     Tony Tolsdorf  National Water and Climate Center   Monitoring Branch Leader   Mailing address  1201 NE Lloyd Blvd., Suite 802  Portland  OR  97232-1274   503-414-3006  tony.tolsdorf@por.usda.gov    Microsoft Windows XP Version 5.1 (Build 2600) Service Pack 3; ESRI ArcCatalog
            9.3.1.3000    Not provided  Not provided    Create the shapefiles using the provided lat/lon from the NRCS, one per
                    station using an Arcpy script. Download the CSV data for each station, each
                    year, each month from the station's Reporting Since date.  20140804     Earth Data Analysis Center   Clearinghouse Manager   mailing address  MSC01 1110  1 University of New Mexico  Albuquerque  NM  87131-0001  USA   505-277-3622 ext. 230  505-277-3614  clearinghouse@edac.unm.edu  8am to 5pm MT, M-F        Rio Arriba County, NM  Vector    Point  744        0.00000001  0.00000001  Decimal degrees    North American Datum of 1983  Geodetic Reference System 80  6378137.000000  298.257222        SCAN  Soil, Climate, Analysis, Network (SCAN)  http://www.wcc.nrcs.usda.gov/scan/    FID  Internal feature number.  ESRI   Sequential unique whole numbers that are automatically generated.     Shape  Feature geometry.  ESRI   Coordinates defining the features.     site_id  NRCS Internal Site Number  NRCS   NRCS Internal Site Number     date  Date  NRCS   Date     time  Time (Pacific Standard Time)  NRCS   Time (Pacific Standard Time)     preci_1  Precipitation Accumulated  NRCS    Unknown  Unknown  inches      tobsi_1  Air Temperature Observed  NRCS    Unknown  Unknown  degrees Celsius      tmaxh_1  Air Temperature Maximum  NRCS    Unknown  Unknown  degrees Celsius      tminh_1  Air Temperature Minimum  NRCS    Unknown  Unknown  degrees Celsius      tavgh_1  Air Temperature Average  NRCS    Unknown  Unknown  degrees Celsius      prcph_1  Precipitation Increment  NRCS    Unknown  Unknown  inches      smsi_1_2l  Soil Moisture Percent; Sensor Height -2in  NRCS    Unknown  Unknown  percent      smsi_1_4l  Soil Moisture Percent; Sensor Height -4in  NRCS    Unknown  Unknown  percent      smsi_1_8l  Soil Moisture Percent; Sensor Height -8in  NRCS    Unknown  Unknown  percent      smsi_1_20l  Soil Moisture Percent; Sensor Height -20in  NRCS    Unknown  Unknown  percent      smsi_1_40l  Soil Moisture Percent; Sensor Height -40in  NRCS    Unknown  Unknown  percent      stoi_1_2  Soil Temperature Observed; Sensor Height -2in  NRCS    Unknown  Unknown  degrees Celsius    
  stoi_1_4  Soil Temperature Observed; Sensor Height -4in  NRCS    Unknown  Unknown  degrees Celsius      stoi_1_8  Soil Temperature Observed; Sensor Height -8in  NRCS    Unknown  Unknown  degrees Celsius      stoi_1_20  Soil Temperature Observed; Sensor Height -20in  NRCS    Unknown  Unknown  degrees Celsius      stoi_1_40  Soil Temperature Observed; Sensor Height -40in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_2  Soil Temperature Observed; Sensor Height -2in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_4  Soil Temperature Observed; Sensor Height -4in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_8  Soil Temperature Observed; Sensor Height -8in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_20  Soil Temperature Observed; Sensor Height -20in  NRCS    Unknown  Unknown  degrees Celsius      stoi_2_40  Soil Temperature Observed; Sensor Height -40in  NRCS    Unknown  Unknown  degrees Celsius      sali_1_2  Salinity; Sensor Height -2in  NRCS    Unknown  Unknown  gram      sali_1_4  Salinity; Sensor Height -4in  NRCS    Unknown  Unknown  gram      sali_1_8  Salinity; Sensor Height -8in  NRCS    Unknown  Unknown  gram      sali_1_20  Salinity; Sensor Height -20in  NRCS    Unknown  Unknown  gram      sali_1_40  Salinity; Sensor Height -40in  NRCS    Unknown  Unknown  gram      rdci_1_2  Real Dielectric Constant; Sensor Height -2in  NRCS    Unknown  Unknown  unitless      rdci_1_4  Real Dielectric Constant; Sensor Height -4in  NRCS    Unknown  Unknown  unitless      rdci_1_8  Real Dielectric Constant; Sensor Height -8in  NRCS    Unknown  Unknown  unitless      rdci_1_20  Real Dielectric Constant; Sensor Height -20in  NRCS    Unknown  Unknown  unitless      rdci_1_40  Real Dielectric Constant; Sensor Height -40in  NRCS    Unknown  Unknown  unitless      batti_1  Battery Voltage  NRCS    Unknown  Unknown  volt      batti_2  Battery Voltage  NRCS    Unknown  Unknown  volt      wdirvh_1  Wind Direction Average  NRCS    Unknown  Unknown  
degrees      wspdxh_1mx  Wind Speed Maximum  NRCS    Unknown  Unknown  mph      wspdvh1  Wind Speed Average  NRCS    Unknown  Unknown  mph      rhumi_1  Relative Humidity; Vaisala  NRCS    Unknown  Unknown  percent      presi_1  Barometric Pressure; Sampled 10 minutes  NRCS    Unknown  Unknown  inches      sradvh_1  Solar Radiation Average  NRCS    Unknown  Unknown  watt      dptph_1  Dew Point Temperature  NRCS    Unknown  Unknown  degrees Celsius      pvpvh_1  Vapor Pressure  NRCS    Unknown  Unknown  kPa      rhumnh_1  Relative Humidity Minimum  NRCS    Unknown  Unknown  percent      rhumxh_1  Relative Humidity Maximum  NRCS    Unknown  Unknown  percent           Earth Data Analysis Center   Clearinghouse Manager   mailing and physical address  MSC01 1110  1 University of New Mexico  Albuquerque  NM  87131-0001  USA   505-277-3622 ext. 230  505-277-3614  clearinghouse@edac.unm.edu  0800 - 1700 MT, M-F -7 hours GMT    Downloadable Data  The material on this site is made available as a public service. Maps and data are to be used for reference purposes only and the Earth Data Analysis Center (EDAC), New Mexico EPSCoR (NM EPSCoR), Energize New Mexico and The University of New Mexico are not responsible for any inaccuracies herein contained. No responsibility is assumed for damages or other liabilities due to the accuracy, availability, use or misuse of the information herein provided. Unless otherwise indicated in the documentation (metadata) for individual data sets, information on this site is public domain and may be copied without permission; citation of the source is appreciated.     ZIP  1       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.original.zip    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  
Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      ESRI Shapefile (shp)  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.shp    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      GML  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.gml    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      KML  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.kml    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      GeoJSON  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.geojson    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      JSON  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.json    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. 
The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      Comma Separated Values (csv)  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.csv    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu      MS Excel format (xls)  5       http://gstore.unm.edu/apps/energize/datasets/68e96cf6-fb14-42aa-bbea-6da546ccb507/scan_201407_2172.derived.xls    Download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico at http://nmepscor.org.     None. The files are available to download from New Mexico EPSCoR (NM EPSCoR), Energize New Mexico (http://nmepscor.org).  Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu   Contact Earth Data Analysis Center at clearinghouse@edac.unm.edu  Adequate computer capability is the only technical prerequisite for viewing data in digital form.    20150423     Earth Data Analysis Center   Clearinghouse Manager   mailing and physical address  MSC01 1110  1 University of New Mexico  Albuquerque  NM  87131-0001  USA   505-277-3622 ext. 230  505-277-3614  clearinghouse@edac.unm.edu  0800 - 1700 MT, M-F -7 hours GMT    FGDC Content Standards for Digital Geospatial Metadata  FGDC-STD-001-1998  local time 68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml

Comparing value for field presentationCat
Doc Value:  vector digital data
Solr Value: vector digital data

Comparing value for field author
Doc Value:  Water and Climate Center, National Resources Conservation Service
Solr Value: Water and Climate Center, National Resources Conservation Service

Comparing value for field authorSurName
Doc Value:  Water and Climate Center, National Resources Conservation Service
Solr Value: Water and Climate Center, National Resources Conservation Service

Comparing value for field authorSurNameSort
Doc Value:  Water and Climate Center, National Resources Conservation Service
Solr Value: Water and Climate Center, National Resources Conservation Service

Comparing value for field investigator
Doc Value:  [Water and Climate Center, National Resources Conservation Service]
Solr Value: [Water and Climate Center, National Resources Conservation Service]

Comparing value for field attributeName
Doc Value:  [FID, Shape, site_id, date, time, preci_1, tobsi_1, tmaxh_1, tminh_1, tavgh_1, prcph_1, smsi_1_2l, smsi_1_4l, smsi_1_8l, smsi_1_20l, smsi_1_40l, stoi_1_2, stoi_1_4, stoi_1_8, stoi_1_20, stoi_1_40, stoi_2_2, stoi_2_4, stoi_2_8, stoi_2_20, stoi_2_40, sali_1_2, sali_1_4, sali_1_8, sali_1_20, sali_1_40, rdci_1_2, rdci_1_4, rdci_1_8, rdci_1_20, rdci_1_40, batti_1, batti_2, wdirvh_1, wspdxh_1mx, wspdvh1, rhumi_1, presi_1, sradvh_1, dptph_1, pvpvh_1, rhumnh_1, rhumxh_1]
Solr Value: [FID, Shape, site_id, date, time, preci_1, tobsi_1, tmaxh_1, tminh_1, tavgh_1, prcph_1, smsi_1_2l, smsi_1_4l, smsi_1_8l, smsi_1_20l, smsi_1_40l, stoi_1_2, stoi_1_4, stoi_1_8, stoi_1_20, stoi_1_40, stoi_2_2, stoi_2_4, stoi_2_8, stoi_2_20, stoi_2_40, sali_1_2, sali_1_4, sali_1_8, sali_1_20, sali_1_40, rdci_1_2, rdci_1_4, rdci_1_8, rdci_1_20, rdci_1_40, batti_1, batti_2, wdirvh_1, wspdxh_1mx, wspdvh1, rhumi_1, presi_1, sradvh_1, dptph_1, pvpvh_1, rhumnh_1, rhumxh_1]

Comparing value for field attributeDescription
Doc Value:  [Internal feature number., Feature geometry., NRCS Internal Site Number, Date, Time (Pacific Standard Time), Precipitation Accumulated, Air Temperature Observed, Air Temperature Maximum, Air Temperature Minimum, Air Temperature Average, Precipitation Increment, Soil Moisture Percent; Sensor Height -2in, Soil Moisture Percent; Sensor Height -4in, Soil Moisture Percent; Sensor Height -8in, Soil Moisture Percent; Sensor Height -20in, Soil Moisture Percent; Sensor Height -40in, Soil Temperature Observed; Sensor Height -2in, Soil Temperature Observed; Sensor Height -4in, Soil Temperature Observed; Sensor Height -8in, Soil Temperature Observed; Sensor Height -20in, Soil Temperature Observed; Sensor Height -40in, Soil Temperature Observed; Sensor Height -2in, Soil Temperature Observed; Sensor Height -4in, Soil Temperature Observed; Sensor Height -8in, Soil Temperature Observed; Sensor Height -20in, Soil Temperature Observed; Sensor Height -40in, Salinity; Sensor Height -2in, Salinity; Sensor Height -4in, Salinity; Sensor Height -8in, Salinity; Sensor Height -20in, Salinity; Sensor Height -40in, Real Dielectric Constant; Sensor Height -2in, Real Dielectric Constant; Sensor Height -4in, Real Dielectric Constant; Sensor Height -8in, Real Dielectric Constant; Sensor Height -20in, Real Dielectric Constant; Sensor Height -40in, Battery Voltage, Battery Voltage, Wind Direction Average, Wind Speed Maximum, Wind Speed Average, Relative Humidity; Vaisala, Barometric Pressure; Sampled 10 minutes, Solar Radiation Average, Dew Point Temperature, Vapor Pressure, Relative Humidity Minimum, Relative Humidity Maximum]
Solr Value: [Internal feature number., Feature geometry., NRCS Internal Site Number, Date, Time (Pacific Standard Time), Precipitation Accumulated, Air Temperature Observed, Air Temperature Maximum, Air Temperature Minimum, Air Temperature Average, Precipitation Increment, Soil Moisture Percent; Sensor Height -2in, Soil Moisture Percent; Sensor Height -4in, Soil Moisture Percent; Sensor Height -8in, Soil Moisture Percent; Sensor Height -20in, Soil Moisture Percent; Sensor Height -40in, Soil Temperature Observed; Sensor Height -2in, Soil Temperature Observed; Sensor Height -4in, Soil Temperature Observed; Sensor Height -8in, Soil Temperature Observed; Sensor Height -20in, Soil Temperature Observed; Sensor Height -40in, Soil Temperature Observed; Sensor Height -2in, Soil Temperature Observed; Sensor Height -4in, Soil Temperature Observed; Sensor Height -8in, Soil Temperature Observed; Sensor Height -20in, Soil Temperature Observed; Sensor Height -40in, Salinity; Sensor Height -2in, Salinity; Sensor Height -4in, Salinity; Sensor Height -8in, Salinity; Sensor Height -20in, Salinity; Sensor Height -40in, Real Dielectric Constant; Sensor Height -2in, Real Dielectric Constant; Sensor Height -4in, Real Dielectric Constant; Sensor Height -8in, Real Dielectric Constant; Sensor Height -20in, Real Dielectric Constant; Sensor Height -40in, Battery Voltage, Battery Voltage, Wind Direction Average, Wind Speed Maximum, Wind Speed Average, Relative Humidity; Vaisala, Barometric Pressure; Sampled 10 minutes, Solar Radiation Average, Dew Point Temperature, Vapor Pressure, Relative Humidity Minimum, Relative Humidity Maximum]

Comparing value for field attributeUnit
Doc Value:  [inches, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, inches, percent, percent, percent, percent, percent, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, gram, gram, gram, gram, gram, unitless, unitless, unitless, unitless, unitless, volt, volt, degrees, mph, mph, percent, inches, watt, degrees Celsius, kPa, percent, percent]
Solr Value: [inches, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, inches, percent, percent, percent, percent, percent, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, degrees Celsius, gram, gram, gram, gram, gram, unitless, unitless, unitless, unitless, unitless, volt, volt, degrees, mph, mph, percent, inches, watt, degrees Celsius, kPa, percent, percent]

Comparing value for field attribute
Doc Value:  [FID  Internal feature number., Shape  Feature geometry., site_id  NRCS Internal Site Number, date  Date, time  Time (Pacific Standard Time), preci_1  Precipitation Accumulated inches, tobsi_1  Air Temperature Observed degrees Celsius, tmaxh_1  Air Temperature Maximum degrees Celsius, tminh_1  Air Temperature Minimum degrees Celsius, tavgh_1  Air Temperature Average degrees Celsius, prcph_1  Precipitation Increment inches, smsi_1_2l  Soil Moisture Percent; Sensor Height -2in percent, smsi_1_4l  Soil Moisture Percent; Sensor Height -4in percent, smsi_1_8l  Soil Moisture Percent; Sensor Height -8in percent, smsi_1_20l  Soil Moisture Percent; Sensor Height -20in percent, smsi_1_40l  Soil Moisture Percent; Sensor Height -40in percent, stoi_1_2  Soil Temperature Observed; Sensor Height -2in degrees Celsius, stoi_1_4  Soil Temperature Observed; Sensor Height -4in degrees Celsius, stoi_1_8  Soil Temperature Observed; Sensor Height -8in degrees Celsius, stoi_1_20  Soil Temperature Observed; Sensor Height -20in degrees Celsius, stoi_1_40  Soil Temperature Observed; Sensor Height -40in degrees Celsius, stoi_2_2  Soil Temperature Observed; Sensor Height -2in degrees Celsius, stoi_2_4  Soil Temperature Observed; Sensor Height -4in degrees Celsius, stoi_2_8  Soil Temperature Observed; Sensor Height -8in degrees Celsius, stoi_2_20  Soil Temperature Observed; Sensor Height -20in degrees Celsius, stoi_2_40  Soil Temperature Observed; Sensor Height -40in degrees Celsius, sali_1_2  Salinity; Sensor Height -2in gram, sali_1_4  Salinity; Sensor Height -4in gram, sali_1_8  Salinity; Sensor Height -8in gram, sali_1_20  Salinity; Sensor Height -20in gram, sali_1_40  Salinity; Sensor Height -40in gram, rdci_1_2  Real Dielectric Constant; Sensor Height -2in unitless, rdci_1_4  Real Dielectric Constant; Sensor Height -4in unitless, rdci_1_8  Real Dielectric Constant; Sensor Height -8in unitless, rdci_1_20  Real Dielectric Constant; Sensor Height -20in unitless, rdci_1_40  Real 
Dielectric Constant; Sensor Height -40in unitless, batti_1  Battery Voltage volt, batti_2  Battery Voltage volt, wdirvh_1  Wind Direction Average degrees, wspdxh_1mx  Wind Speed Maximum mph, wspdvh1  Wind Speed Average mph, rhumi_1  Relative Humidity; Vaisala percent, presi_1  Barometric Pressure; Sampled 10 minutes inches, sradvh_1  Solar Radiation Average watt, dptph_1  Dew Point Temperature degrees Celsius, pvpvh_1  Vapor Pressure kPa, rhumnh_1  Relative Humidity Minimum percent, rhumxh_1  Relative Humidity Maximum percent]
Solr Value: [FID  Internal feature number., Shape  Feature geometry., site_id  NRCS Internal Site Number, date  Date, time  Time (Pacific Standard Time), preci_1  Precipitation Accumulated inches, tobsi_1  Air Temperature Observed degrees Celsius, tmaxh_1  Air Temperature Maximum degrees Celsius, tminh_1  Air Temperature Minimum degrees Celsius, tavgh_1  Air Temperature Average degrees Celsius, prcph_1  Precipitation Increment inches, smsi_1_2l  Soil Moisture Percent; Sensor Height -2in percent, smsi_1_4l  Soil Moisture Percent; Sensor Height -4in percent, smsi_1_8l  Soil Moisture Percent; Sensor Height -8in percent, smsi_1_20l  Soil Moisture Percent; Sensor Height -20in percent, smsi_1_40l  Soil Moisture Percent; Sensor Height -40in percent, stoi_1_2  Soil Temperature Observed; Sensor Height -2in degrees Celsius, stoi_1_4  Soil Temperature Observed; Sensor Height -4in degrees Celsius, stoi_1_8  Soil Temperature Observed; Sensor Height -8in degrees Celsius, stoi_1_20  Soil Temperature Observed; Sensor Height -20in degrees Celsius, stoi_1_40  Soil Temperature Observed; Sensor Height -40in degrees Celsius, stoi_2_2  Soil Temperature Observed; Sensor Height -2in degrees Celsius, stoi_2_4  Soil Temperature Observed; Sensor Height -4in degrees Celsius, stoi_2_8  Soil Temperature Observed; Sensor Height -8in degrees Celsius, stoi_2_20  Soil Temperature Observed; Sensor Height -20in degrees Celsius, stoi_2_40  Soil Temperature Observed; Sensor Height -40in degrees Celsius, sali_1_2  Salinity; Sensor Height -2in gram, sali_1_4  Salinity; Sensor Height -4in gram, sali_1_8  Salinity; Sensor Height -8in gram, sali_1_20  Salinity; Sensor Height -20in gram, sali_1_40  Salinity; Sensor Height -40in gram, rdci_1_2  Real Dielectric Constant; Sensor Height -2in unitless, rdci_1_4  Real Dielectric Constant; Sensor Height -4in unitless, rdci_1_8  Real Dielectric Constant; Sensor Height -8in unitless, rdci_1_20  Real Dielectric Constant; Sensor Height -20in unitless, rdci_1_40  Real 
Dielectric Constant; Sensor Height -40in unitless, batti_1  Battery Voltage volt, batti_2  Battery Voltage volt, wdirvh_1  Wind Direction Average degrees, wspdxh_1mx  Wind Speed Maximum mph, wspdvh1  Wind Speed Average mph, rhumi_1  Relative Humidity; Vaisala percent, presi_1  Barometric Pressure; Sampled 10 minutes inches, sradvh_1  Solar Radiation Average watt, dptph_1  Dew Point Temperature degrees Celsius, pvpvh_1  Vapor Pressure kPa, rhumnh_1  Relative Humidity Minimum percent, rhumxh_1  Relative Humidity Maximum percent]

Comparing value for field geohash_1
Doc Value:  [9]
Solr Value: [9]

Comparing value for field geohash_2
Doc Value:  [9w]
Solr Value: [9w]

Comparing value for field geohash_3
Doc Value:  [9wk]
Solr Value: [9wk]

Comparing value for field geohash_4
Doc Value:  [9wkt]
Solr Value: [9wkt]

Comparing value for field geohash_5
Doc Value:  [9wkt6]
Solr Value: [9wkt6]

Comparing value for field geohash_6
Doc Value:  [9wkt6b]
Solr Value: [9wkt6b]

Comparing value for field geohash_7
Doc Value:  [9wkt6bb]
Solr Value: [9wkt6bb]

Comparing value for field geohash_8
Doc Value:  [9wkt6bb8]
Solr Value: [9wkt6bb8]

Comparing value for field geohash_9
Doc Value:  [9wkt6bb8x]
Solr Value: [9wkt6bb8x]

Comparing value for field dataUrl
Doc Value:  https://cn.dataone.org/cn/v2/resolve/68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml
Solr Value: https://cn.dataone.org/cn/v2/resolve/68e96cf6-fb14-42aa-bbea-6da546ccb507-scan_201407_2172.xml

[ INFO] 2019-11-12 04:26:03,834 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:03,919 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:03,935 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ INFO] 2019-11-12 04:26:03,939 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:03,940 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6039446b
[ INFO] 2019-11-12 04:26:03,941 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns
[ INFO] 2019-11-12 04:26:03,948 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:03,948 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6e4d1f61
[ INFO] 2019-11-12 04:26:03,950 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:03,950 [TEST-SolrIndexFieldTest.testSystemMetadataAndFgdcScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
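The `sendCommand using partial update?: false` line above means the indexer pushed a full replacement document to Solr rather than a partial one. For contrast, a minimal sketch of what a Solr *atomic* (partial) update payload looks like — the `set`/`add` field modifiers are standard Solr atomic-update syntax, but the specific fields and values below are illustrative, not taken from this test run:

```python
import json

# Solr atomic updates modify named fields of an existing document in place.
# Each field maps to a modifier object: "set" replaces, "add" appends to a
# multi-valued field. A full-replace update (as in this log) would instead
# send the complete document with plain field values.
doc_update = {
    "id": "www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns",
    "title": {"set": "Albatross Bay Adult Prawn Data 1986-1992"},
    "keywords": {"add": ["Albatross Bay"]},  # append to multi-valued field
}

# Solr's JSON update handler accepts a list of documents.
payload = json.dumps([doc_update])
```

Atomic updates require all fields to be stored (or recoverable from docValues) so Solr can reconstruct the unmodified fields; that constraint is one common reason an indexer defaults to full replacement.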
Comparing value for field abstract
Doc Value:  Adult prawn species, size, sex, reproductive stage, moult stage, and parasites were measured at 20 stations in Albatross Bay, Gulf of Carpentaria. Sampling was carried out monthly between 1986 and 1992. This metadata record is sourced from 'MarLIN', the CSIRO Marine Laboratories Information Network.
Solr Value: Adult prawn species, size, sex, reproductive stage, moult stage, and parasites were measured at 20 stations in Albatross Bay, Gulf of Carpentaria. Sampling was carried out monthly between 1986 and 1992. This metadata record is sourced from 'MarLIN', the CSIRO Marine Laboratories Information Network.

Comparing value for field beginDate
Doc Value:  Sat Mar 01 00:00:00 GMT 1986
Solr Value: Sat Mar 01 00:00:00 GMT 1986

Comparing value for field contactOrganization
Doc Value:  [CSIRO Division of Marine Research-Hobart]
Solr Value: [CSIRO Division of Marine Research-Hobart]

Comparing value for field eastBoundCoord
Doc Value:  142.0
Solr Value: 142.0

Comparing value for field westBoundCoord
Doc Value:  141.5
Solr Value: 141.5

Comparing value for field northBoundCoord
Doc Value:  -12.5
Solr Value: -12.5

Comparing value for field southBoundCoord
Doc Value:  -13.0
Solr Value: -13.0

Comparing value for field endDate
Doc Value:  Wed Apr 01 00:00:00 GMT 1992
Solr Value: Wed Apr 01 00:00:00 GMT 1992
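The `beginDate`/`endDate` values above are the FGDC temporal-extent strings from the record (`19860301`, `19920401`, visible later in the `text` field dump) printed in Java's default `Date.toString()` layout. A small sketch of the equivalent parse-and-format step, in Python for brevity (the test itself is Java):

```python
from datetime import datetime

def fgdc_date_to_java_style(yyyymmdd: str) -> str:
    """Parse a compact FGDC calendar date (yyyymmdd) and render it the way
    Java's Date.toString() does: 'EEE MMM dd HH:mm:ss zzz yyyy'."""
    d = datetime.strptime(yyyymmdd, "%Y%m%d")
    # Midnight GMT, matching the log output above.
    return d.strftime("%a %b %d %H:%M:%S GMT %Y")

print(fgdc_date_to_java_style("19860301"))  # Sat Mar 01 00:00:00 GMT 1986
print(fgdc_date_to_java_style("19920401"))  # Wed Apr 01 00:00:00 GMT 1992
```

The weekday names in the log (`Sat`, `Wed`) fall out of the parse; they are not stored in the FGDC record.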

Comparing value for field keywords
Doc Value:  [BIOMASS|LANDSAT TM|LANDSAT-5, adult prawn data, size, sex, reproductive stage, moult stage, parasites, Australlia, Gulf of Carpentaria, Albatross Bay]
Solr Value: [BIOMASS|LANDSAT TM|LANDSAT-5, adult prawn data, size, sex, reproductive stage, moult stage, parasites, Australlia, Gulf of Carpentaria, Albatross Bay]

Comparing value for field geoform
Doc Value:  maps data
Solr Value: maps data

Comparing value for field kingdom
Doc Value:  [Animalia]
Solr Value: [Animalia]

Comparing value for field order
Doc Value:  [Decapoda]
Solr Value: [Decapoda]

Comparing value for field phylum
Doc Value:  [Arthropoda, Test]
Solr Value: [Arthropoda, Test]

Comparing value for field class
Doc Value:  [Malacostraca]
Solr Value: [Malacostraca]

Comparing value for field origin
Doc Value:  [CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)]
Solr Value: [CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)]

Comparing value for field placeKey
Doc Value:  [Australlia, Gulf of Carpentaria, Albatross Bay]
Solr Value: [Australlia, Gulf of Carpentaria, Albatross Bay]

Comparing value for field pubDate
Doc Value:  Fri Jan 01 00:00:00 GMT 1993
Solr Value: Fri Jan 01 00:00:00 GMT 1993

Comparing value for field purpose
Doc Value:  The purpose of the dataset is to provide information about adult prawn species in Albatross Bay, Gulf of Carpentaria.
Solr Value: The purpose of the dataset is to provide information about adult prawn species in Albatross Bay, Gulf of Carpentaria.

Comparing value for field title
Doc Value:  Albatross Bay Adult Prawn Data 1986-1992
Solr Value: Albatross Bay Adult Prawn Data 1986-1992

Comparing value for field fileID
Doc Value:  https://cn.dataone.org/cn/v2/resolve/www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns
Solr Value: https://cn.dataone.org/cn/v2/resolve/www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns

Comparing value for field text
Doc Value:  http://www.nbii.gov/metadata/mdata/CSIRO/csiro_d_abayadultprawns.xml     CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)  1993  Albatross Bay Adult Prawn Data 1986-1992  maps data   Australia  CSIRO Division of Marine Research      Adult prawn species, size, sex, reproductive stage, moult stage, and parasites were measured at 20 stations in Albatross Bay, Gulf of Carpentaria. Sampling was carried out monthly between 1986 and 1992. This metadata record is sourced from 'MarLIN', the CSIRO Marine Laboratories Information Network.  The purpose of the dataset is to provide information about adult prawn species in Albatross Bay, Gulf of Carpentaria.  Information was obtained from http://www.marine.csiro.au/marine/mcdd/data/CSIRODMR/CSIRODMR_datasets.html.

The previous online linkage was determined to be broken in October 2010 and moved here.  The previous online linkage was:
http://www.marine.csiro.au/marine/mcdd/data/CSIRODMR/Albatross_Bay_Adult_Prawn_Data_1986_1992.HTML      19860301  19920401    ground condition    Complete  None planned    Australia, Gulf of Carpentaria   141.5  142  -12.5  -13      ISO 19115 Topic Category  none    Parameter_Sensor_Source  BIOMASS|LANDSAT TM|LANDSAT-5    none  adult prawn data  size  sex  reproductive stage  moult stage  parasites  NONE    none  Australlia  Gulf of Carpentaria  Albatross Bay      none  prawns  shrimps  crustaceans       Agencies listed below  2002  Integrated Taxonomic Information System  Database   Washington, D.C.  U.S. Department of Agriculture   Department of Commerce, National Oceanic and Atmospheric Administration (NOAA),  Department of Interior (DOI), Geological Survey (USGS), Environmental Protection Agency (EPA), Department of Agriculture (USDA), Agriculture Research Service (ARS) Natural Resources Conservation Service (NRCS) Smithsonian Institution National Museum of Natural History (NMNH).  http://www.itis.usda.gov/       Kingdom  Animalia  animals   Phylum  Arthropoda  arthropods   Division  Test  arthropods   Subphylum  Crustacea  crustaceans   Class  Malacostraca   Subclass  Eumalacostraca   Superorder  Eucarida   Order  Decapoda  crabs  crayfishes  lobsters  prawns  shrimp           Release with the permission of the custodian.  None     Peter Crocos  CSIRO Division of Marine Research-Cleveland    mailing address  P.O. Box 120  Cleveland  Queensland  4163  Australia   unknown  unknown  peter.crocos@csiro.au      CSIRO Division of Marine Research  Unknown  Albatross Bay Chlorophyll Data 1986-1992  unknown   Australia  CSIRO Division of Marine Research       Stephen Blaber, David Brewer, John Salini, J. 
Kerr  Unknown  Albatross Bay Fish Data 1986-1988  unknown   Australia  CSIRO Division of Marine Research       Stever Blaber, CSIRO Division of Marine Research  Unknown  Albatross Bay Nearshore Fish Study 1991-1992  unknown   Australia  CSIRO Division of Marine Research       CSIRO Division of Marine Research  Unknown  Albatross Bay Nutrient Data 1992  unknown   Australia  CSIRO Division of Marine Research       Chris Jackson, CSIRO Division of Marine Research  Unknown  Albatross Bay Phytoplankton Data 1986-1992  unknown   Queensland, Australia  CSIRO Division of Marine Research       CSIRO Division of Marine Research  Unknown  Albatross Bay Prawn Larval Data  unknown   Queensland, Australia  CSIRO Division of Marine Research       CSIRO Division of Marine Research  Unknown  Albatross Bay Primary Productivity  unknown   Queensland, Australia  CSIRO Division of Marine Research       not applicable  Twenty stations were sampled.    Field  unknown    unknown  Unknown      Point     Entity - Adult Prawn in Albatross Bay, Gulf of Carpentaria, Australia; Attributes - size, sex, reproductive stage, moult stage, parasites  unknown        Tony Rees  CSIRO Division of Marine Research-Hobart    mailing address  Hobart  Australia   unknown  unknown  Tony.Rees@csiro.au    You accept all risks and responsibility for losses, damages, costs and other consequences resulting directly or indirectly from using this site and andinformation or material available from it. To the maximum permitted by law, CSIRO excludes all liability to any person arising directly or indirectly from using this site and any information or material available from it.  Please contact distributor.    19980710  20020930  20030930     Cheryl Solomon  Science Systems and Applications, Inc.   Metadata specialist   mailing and physical address  10210 Greenbelt Road, Suite 500  Lanham  Maryland  20706   301 867-2080  301-867-2149.  
solomon@gcmd.nasa.gov    FGDC Biological Data Profile of the Content Standard for Digital Geospatial Metadata  FGDC-STD-001.1-1999 www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns
Solr Value: http://www.nbii.gov/metadata/mdata/CSIRO/csiro_d_abayadultprawns.xml     CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)  1993  Albatross Bay Adult Prawn Data 1986-1992  maps data   Australia  CSIRO Division of Marine Research      Adult prawn species, size, sex, reproductive stage, moult stage, and parasites were measured at 20 stations in Albatross Bay, Gulf of Carpentaria. Sampling was carried out monthly between 1986 and 1992. This metadata record is sourced from 'MarLIN', the CSIRO Marine Laboratories Information Network.  The purpose of the dataset is to provide information about adult prawn species in Albatross Bay, Gulf of Carpentaria.  Information was obtained from http://www.marine.csiro.au/marine/mcdd/data/CSIRODMR/CSIRODMR_datasets.html.

The previous online linkage was determined to be broken in October 2010 and moved here.  The previous online linkage was:
http://www.marine.csiro.au/marine/mcdd/data/CSIRODMR/Albatross_Bay_Adult_Prawn_Data_1986_1992.HTML      19860301  19920401    ground condition    Complete  None planned    Australia, Gulf of Carpentaria   141.5  142  -12.5  -13      ISO 19115 Topic Category  none    Parameter_Sensor_Source  BIOMASS|LANDSAT TM|LANDSAT-5    none  adult prawn data  size  sex  reproductive stage  moult stage  parasites  NONE    none  Australlia  Gulf of Carpentaria  Albatross Bay      none  prawns  shrimps  crustaceans       Agencies listed below  2002  Integrated Taxonomic Information System  Database   Washington, D.C.  U.S. Department of Agriculture   Department of Commerce, National Oceanic and Atmospheric Administration (NOAA),  Department of Interior (DOI), Geological Survey (USGS), Environmental Protection Agency (EPA), Department of Agriculture (USDA), Agriculture Research Service (ARS) Natural Resources Conservation Service (NRCS) Smithsonian Institution National Museum of Natural History (NMNH).  http://www.itis.usda.gov/       Kingdom  Animalia  animals   Phylum  Arthropoda  arthropods   Division  Test  arthropods   Subphylum  Crustacea  crustaceans   Class  Malacostraca   Subclass  Eumalacostraca   Superorder  Eucarida   Order  Decapoda  crabs  crayfishes  lobsters  prawns  shrimp           Release with the permission of the custodian.  None     Peter Crocos  CSIRO Division of Marine Research-Cleveland    mailing address  P.O. Box 120  Cleveland  Queensland  4163  Australia   unknown  unknown  peter.crocos@csiro.au      CSIRO Division of Marine Research  Unknown  Albatross Bay Chlorophyll Data 1986-1992  unknown   Australia  CSIRO Division of Marine Research       Stephen Blaber, David Brewer, John Salini, J. 
Kerr  Unknown  Albatross Bay Fish Data 1986-1988  unknown   Australia  CSIRO Division of Marine Research       Stever Blaber, CSIRO Division of Marine Research  Unknown  Albatross Bay Nearshore Fish Study 1991-1992  unknown   Australia  CSIRO Division of Marine Research       CSIRO Division of Marine Research  Unknown  Albatross Bay Nutrient Data 1992  unknown   Australia  CSIRO Division of Marine Research       Chris Jackson, CSIRO Division of Marine Research  Unknown  Albatross Bay Phytoplankton Data 1986-1992  unknown   Queensland, Australia  CSIRO Division of Marine Research       CSIRO Division of Marine Research  Unknown  Albatross Bay Prawn Larval Data  unknown   Queensland, Australia  CSIRO Division of Marine Research       CSIRO Division of Marine Research  Unknown  Albatross Bay Primary Productivity  unknown   Queensland, Australia  CSIRO Division of Marine Research       not applicable  Twenty stations were sampled.    Field  unknown    unknown  Unknown      Point     Entity - Adult Prawn in Albatross Bay, Gulf of Carpentaria, Australia; Attributes - size, sex, reproductive stage, moult stage, parasites  unknown        Tony Rees  CSIRO Division of Marine Research-Hobart    mailing address  Hobart  Australia   unknown  unknown  Tony.Rees@csiro.au    You accept all risks and responsibility for losses, damages, costs and other consequences resulting directly or indirectly from using this site and andinformation or material available from it. To the maximum permitted by law, CSIRO excludes all liability to any person arising directly or indirectly from using this site and any information or material available from it.  Please contact distributor.    19980710  20020930  20030930     Cheryl Solomon  Science Systems and Applications, Inc.   Metadata specialist   mailing and physical address  10210 Greenbelt Road, Suite 500  Lanham  Maryland  20706   301 867-2080  301-867-2149.  
solomon@gcmd.nasa.gov    FGDC Biological Data Profile of the Content Standard for Digital Geospatial Metadata  FGDC-STD-001.1-1999 www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns

Comparing value for field presentationCat
Doc Value:  maps data
Solr Value: maps data

Comparing value for field author
Doc Value:  CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)
Solr Value: CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)

Comparing value for field authorSurName
Doc Value:  CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)
Solr Value: CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)

Comparing value for field authorSurNameSort
Doc Value:  CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)
Solr Value: CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)

Comparing value for field investigator
Doc Value:  [CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)]
Solr Value: [CSIRO Marine Research (formerly CSIRO Division of Fisheries/Fisheries Research)]

Comparing value for field site
Doc Value:  [Australia, Gulf of Carpentaria]
Solr Value: [Australia, Gulf of Carpentaria]

Comparing value for field geohash_1
Doc Value:  [r]
Solr Value: [r]

Comparing value for field geohash_2
Doc Value:  [rj]
Solr Value: [rj]

Comparing value for field geohash_3
Doc Value:  [rjs]
Solr Value: [rjs]

Comparing value for field geohash_4
Doc Value:  [rjsz]
Solr Value: [rjsz]

Comparing value for field geohash_5
Doc Value:  [rjsz3]
Solr Value: [rjsz3]

Comparing value for field geohash_6
Doc Value:  [rjsz3w]
Solr Value: [rjsz3w]

Comparing value for field geohash_7
Doc Value:  [rjsz3wg]
Solr Value: [rjsz3wg]

Comparing value for field geohash_8
Doc Value:  [rjsz3wgj]
Solr Value: [rjsz3wgj]

Comparing value for field geohash_9
Doc Value:  [rjsz3wgjy]
Solr Value: [rjsz3wgjy]

Comparing value for field id
Doc Value:  www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns
Solr Value: www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns

Comparing value for field identifier
Doc Value:  www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns
Solr Value: www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns

Comparing value for field formatId
Doc Value:  FGDC-STD-001.1-1999
Solr Value: FGDC-STD-001.1-1999

Comparing value for field formatType
Doc Value:  METADATA
Solr Value: METADATA

Comparing value for field size
Doc Value:  9008
Solr Value: 9008

Comparing value for field checksum
Doc Value:  86bc6417ef29b6fbd279160699044e5e
Solr Value: 86bc6417ef29b6fbd279160699044e5e

Comparing value for field submitter
Doc Value:  CN=Dave Vieglais T799,O=Google,C=US,DC=cilogon,DC=org
Solr Value: CN=Dave Vieglais T799,O=Google,C=US,DC=cilogon,DC=org

Comparing value for field checksumAlgorithm
Doc Value:  MD5
Solr Value: MD5

Comparing value for field rightsHolder
Doc Value:  CN=Dave Vieglais T799,O=Google,C=US,DC=cilogon,DC=org
Solr Value: CN=Dave Vieglais T799,O=Google,C=US,DC=cilogon,DC=org

Comparing value for field replicationAllowed
Doc Value:  true
Solr Value: true

Comparing value for field numberReplicas
Doc Value:  3
Solr Value: 3

Comparing value for field obsoletes
Doc Value:  csiro_c_abayadultprawns
Solr Value: csiro_c_abayadultprawns

Comparing value for field obsoletedBy
Doc Value:  csiro_e_abayadultprawns
Solr Value: csiro_e_abayadultprawns

Comparing value for field dateUploaded
Doc Value:  Thu Mar 22 13:55:48 GMT 2012
Solr Value: Thu Mar 22 13:55:48 GMT 2012

Comparing value for field dateModified
Doc Value:  Thu Mar 22 13:55:48 GMT 2012
Solr Value: Thu Mar 22 13:55:48 GMT 2012

Comparing value for field datasource
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field authoritativeMN
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field readPermission
Doc Value:  [public]
Solr Value: [public]

Comparing value for field writePermission
Doc Value:  [CN=Dave Vieglais T799,O=Google,C=US,DC=cilogon,DC=org]
Solr Value: [CN=Dave Vieglais T799,O=Google,C=US,DC=cilogon,DC=org]

Comparing value for field isPublic
Doc Value:  true
Solr Value: true

Comparing value for field dataUrl
Doc Value:  https://cn.dataone.org/cn/v2/resolve/www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns
Solr Value: https://cn.dataone.org/cn/v2/resolve/www.nbii.gov_metadata_mdata_CSIRO_csiro_d_abayadultprawns

[ INFO] 2019-11-12 04:26:04,221 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:04,296 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:04,306 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ INFO] 2019-11-12 04:26:04,310 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:04,311 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@20ccf143
[ INFO] 2019-11-12 04:26:04,312 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:04,326 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:04,328 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@55d0248b
[ INFO] 2019-11-12 04:26:04,330 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:04,332 [TEST-SolrIndexFieldTest.testSystemMetadataAndEml210ScienceData-seed#[C0AFD2B3CC26AD0C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Comparing value for field abstract
Doc Value:  This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
Solr Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.

Comparing value for field keywords
Doc Value:  [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
Solr Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]

Comparing value for field title
Doc Value:  Augrabies falls National Park census data.
Solr Value: Augrabies falls National Park census data.

Comparing value for field southBoundCoord
Doc Value:  26.0
Solr Value: 26.0

Comparing value for field northBoundCoord
Doc Value:  26.0
Solr Value: 26.0

Comparing value for field westBoundCoord
Doc Value:  -120.31121
Solr Value: -120.31121

Comparing value for field eastBoundCoord
Doc Value:  -120.31121
Solr Value: -120.31121

Comparing value for field site
Doc Value:  [Agulhas falls national Park]
Solr Value: [Agulhas falls national Park]

Comparing value for field beginDate
Doc Value:  Thu Jan 01 00:00:00 GMT 1998
Solr Value: Thu Jan 01 00:00:00 GMT 1998

Comparing value for field endDate
Doc Value:  Fri Feb 13 00:00:00 GMT 2004
Solr Value: Fri Feb 13 00:00:00 GMT 2004

Comparing value for field author
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorSurName
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorSurNameSort
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorLastName
Doc Value:  [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]

Comparing value for field investigator
Doc Value:  [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]

Comparing value for field origin
Doc Value:  [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
Solr Value: [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]

Comparing value for field contactOrganization
Doc Value:  [SANParks, The Awesome Store]
Solr Value: [SANParks, The Awesome Store]

Comparing value for field genus
Doc Value:  [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
Solr Value: [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]

Comparing value for field species
Doc Value:  [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
Solr Value: [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]

Comparing value for field scientificName
Doc Value:  [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
Solr Value: [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]

Comparing value for field attributeName
Doc Value:  [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
Solr Value: [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]

Comparing value for field attributeDescription
Doc Value:  [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
Solr Value: [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]

Comparing value for field attributeUnit
Doc Value:  [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
Solr Value: [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]

Comparing value for field attribute
Doc Value:  [ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]
Solr Value: [ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]

Comparing value for field fileID
Doc Value:  https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4

Comparing value for field text
Doc Value:  Augrabies falls National Park census data.   SANParks    Garcia  Freddy   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa     Freeman  Gordon   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    The Awesome Store  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.   SANParks, South Africa  Augrabies Falls National Park,South Africa  Census data  EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature    Agulhas falls national Park   -120.311210  -120.311210  26.0  26.0       1998    2004-02-13       genus  Antidorcas   species  marsupialis  Hartmans Zebra     Genus  Cercopithecus   Species  aethiops  Vervet monkey     Genus  Diceros   Species  bicornis  Baboon     Genus  Equus   Species  hartmannae  Giraffe     Genus  Giraffa   Species  camelopardalis  Kudu     Genus  Oreotragus   Species  oreotragus  Gemsbok     Genus  Oryz   Species  gazella  Eland     Genus  Papio   Species  hamadryas     Genus  Taurotragus   Species  oryx  Black rhino     Genus  Tragelaphus   Species  strepsiceros  Klipspringer      1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
Solr Value: Augrabies falls National Park census data.   SANParks    Garcia  Freddy   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa     Freeman  Gordon   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    The Awesome Store  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.   SANParks, South Africa  Augrabies Falls National Park,South Africa  Census data  EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature    Agulhas falls national Park   -120.311210  -120.311210  26.0  26.0       1998    2004-02-13       genus  Antidorcas   species  marsupialis  Hartmans Zebra     Genus  Cercopithecus   Species  aethiops  Vervet monkey     Genus  Diceros   Species  bicornis  Baboon     Genus  Equus   Species  hartmannae  Giraffe     Genus  Giraffa   Species  camelopardalis  Kudu     Genus  Oreotragus   Species  oreotragus  Gemsbok     Genus  Oryz   Species  gazella  Eland     Genus  Papio   Species  hamadryas     Genus  Taurotragus   Species  oryx  Black rhino     Genus  Tragelaphus   Species  strepsiceros  Klipspringer      1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless

Comparing value for field geohash_1
Doc Value:  [9]
Solr Value: [9]

Comparing value for field geohash_2
Doc Value:  [9k]
Solr Value: [9k]

Comparing value for field geohash_3
Doc Value:  [9kd]
Solr Value: [9kd]

Comparing value for field geohash_4
Doc Value:  [9kd7]
Solr Value: [9kd7]

Comparing value for field geohash_5
Doc Value:  [9kd7y]
Solr Value: [9kd7y]

Comparing value for field geohash_6
Doc Value:  [9kd7ym]
Solr Value: [9kd7ym]

Comparing value for field geohash_7
Doc Value:  [9kd7ym0]
Solr Value: [9kd7ym0]

Comparing value for field geohash_8
Doc Value:  [9kd7ym0h]
Solr Value: [9kd7ym0h]

Comparing value for field geohash_9
Doc Value:  [9kd7ym0hc]
Solr Value: [9kd7ym0hc]

Comparing value for field isService
Doc Value:  false
Solr Value: false

Comparing value for field id
Doc Value:  peggym.130.4
Solr Value: peggym.130.4

Comparing value for field identifier
Doc Value:  peggym.130.4
Solr Value: peggym.130.4

Comparing value for field seriesId
Doc Value:  peggym.130
Solr Value: peggym.130

Comparing value for field formatId
Doc Value:  eml://ecoinformatics.org/eml-2.1.0
Solr Value: eml://ecoinformatics.org/eml-2.1.0

Comparing value for field formatType
Doc Value:  METADATA
Solr Value: METADATA

Comparing value for field size
Doc Value:  36281
Solr Value: 36281

Comparing value for field checksum
Doc Value:  24426711d5385a9ffa583a13d07af2502884932f
Solr Value: 24426711d5385a9ffa583a13d07af2502884932f

Comparing value for field submitter
Doc Value:  dataone_integration_test_user
Solr Value: dataone_integration_test_user

Comparing value for field checksumAlgorithm
Doc Value:  SHA-1
Solr Value: SHA-1

Comparing value for field rightsHolder
Doc Value:  dataone_integration_test_user
Solr Value: dataone_integration_test_user

Comparing value for field replicationAllowed
Doc Value:  true
Solr Value: true

Comparing value for field obsoletes
Doc Value:  peggym.130.3
Solr Value: peggym.130.3

Comparing value for field dateUploaded
Doc Value:  Wed Aug 31 15:59:50 GMT 2011
Solr Value: Wed Aug 31 15:59:50 GMT 2011

Comparing value for field dateModified
Doc Value:  Wed Aug 31 15:59:50 GMT 2011
Solr Value: Wed Aug 31 15:59:50 GMT 2011

Comparing value for field datasource
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field authoritativeMN
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field readPermission
Doc Value:  [public, dataone_public_user, dataone_test_user]
Solr Value: [public, dataone_public_user, dataone_test_user]

Comparing value for field writePermission
Doc Value:  [dataone_integration_test_user]
Solr Value: [dataone_integration_test_user]

Comparing value for field isPublic
Doc Value:  true
Solr Value: true

Comparing value for field dataUrl
Doc Value:  https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4

[WARNING] Tests run: 4, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 1.679 s - in org.dataone.cn.index.SolrIndexFieldTest
[INFO] Running org.dataone.cn.index.SolrFieldDublinCoreTest
[ERROR] 2019-11-12 04:26:04,415 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198) 
[ERROR] 2019-11-12 04:26:04,415 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[ERROR] 2019-11-12 04:26:04,415 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198) 
[ERROR] 2019-11-12 04:26:04,415 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[ERROR] 2019-11-12 04:26:04,416 [main]  (org.dataone.cn.indexer.parser.TemporalPeriodSolrField:getFields:79) Couldn't extract 'start' or 'end' date for pid dcterms_spatial_no_namespace. Temporal pattern of type period needs to contain at least one of these. Value was: 
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.dataone.cn.index.SolrFieldDublinCoreTest
[INFO] Running org.dataone.cn.index.SolrTokenenizerTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrTokenenizerTest_760B850B8CF8E35C-001/init-core-data-001
[ INFO] 2019-11-12 04:26:04,639 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:04,743 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:04,765 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:26:04,939 [coreLoadExecutor-35-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:26:04,940 [coreLoadExecutor-35-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:26:04,957 [coreLoadExecutor-35-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:26:04,965 [coreLoadExecutor-35-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ WARN] 2019-11-12 04:26:04,974 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:04,974 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:04,975 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-11-12 04:26:04,975 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:04,975 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:04,979 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:04,979 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@7ebd0922
[ INFO] 2019-11-12 04:26:04,980 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:04,988 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:04,988 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@501a5c54
[ INFO] 2019-11-12 04:26:04,989 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:04,989 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:05,022 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,023 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@345f7f2
[ INFO] 2019-11-12 04:26:05,023 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:05,025 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,025 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@69614f02
[ INFO] 2019-11-12 04:26:05,026 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:05,026 [TEST-SolrTokenenizerTest.testTokenizingParentheses-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:05,265 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:05,344 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:05,354 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:05,364 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:05,364 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:05,364 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=4}
[ WARN] 2019-11-12 04:26:05,365 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:05,365 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:05,368 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,369 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@55f86041
[ INFO] 2019-11-12 04:26:05,369 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:05,377 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,378 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2106f798
[ INFO] 2019-11-12 04:26:05,378 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:05,378 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:05,400 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,401 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2364312c
[ INFO] 2019-11-12 04:26:05,401 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:05,403 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,403 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@27b54524
[ INFO] 2019-11-12 04:26:05,404 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:05,404 [TEST-SolrTokenenizerTest.testTokenizingPeriod-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:05,661 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:05,736 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:05,752 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:05,763 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:05,763 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:05,764 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:05,764 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:05,764 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:05,767 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,768 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5e8b3fb9
[ INFO] 2019-11-12 04:26:05,768 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:05,776 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,776 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@359ecdc6
[ INFO] 2019-11-12 04:26:05,777 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:05,777 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:05,797 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,797 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5ad10c76
[ INFO] 2019-11-12 04:26:05,798 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:05,799 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:05,800 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@64f80dcf
[ INFO] 2019-11-12 04:26:05,800 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:05,800 [TEST-SolrTokenenizerTest.testTokenizingCaseSensitive-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:06,108 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:06,198 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:06,212 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:06,223 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:06,225 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:06,227 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:06,228 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:06,231 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:06,237 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,240 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@26c85644
[ INFO] 2019-11-12 04:26:06,243 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:06,283 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,285 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@216ee3ca
[ INFO] 2019-11-12 04:26:06,285 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:06,286 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:06,304 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,304 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@76e46fbe
[ INFO] 2019-11-12 04:26:06,304 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:06,306 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,306 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6ee3915c
[ INFO] 2019-11-12 04:26:06,306 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:06,306 [TEST-SolrTokenenizerTest.testQuotations-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
id = peggym.130.4
identifier = peggym.130.4
seriesId = peggym.130
formatId = eml://ecoinformatics.org/eml-2.1.0
formatType = METADATA
size = 36281
checksum = 24426711d5385a9ffa583a13d07af2502884932f
submitter = dataone_integration_test_user
checksumAlgorithm = SHA-1
rightsHolder = dataone_integration_test_user
replicationAllowed = true
obsoletes = peggym.130.3
dateUploaded = Wed Aug 31 15:59:50 ART 2011
updateDate = Wed Aug 31 15:59:50 ART 2011
dateModified = Wed Aug 31 15:59:50 ART 2011
datasource = test_documents
authoritativeMN = test_documents
readPermission = [public, dataone_public_user, dataone_test_user]
writePermission = [dataone_integration_test_user]
isPublic = true
dataUrl = https://cn.dataone.org/cn/v2/resolve/peggym.130.4
isService = false
abstract = This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
keywords = [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
title = Augrabies falls National Park census data.
southBoundCoord = 26.0
northBoundCoord = 26.0
westBoundCoord = -120.31121
eastBoundCoord = -120.31121
site = [Agulhas falls national Park]
beginDate = Thu Jan 01 00:00:00 ART 1998
endDate = Fri Feb 13 00:00:00 ART 2004
author = SANParks
authorSurName = SANParks
authorSurNameSort = SANParks
authorLastName = [SANParks, Garcia, Freeman]
investigator = [SANParks, Garcia, Freeman]
origin = [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
contactOrganization = [SANParks, The Awesome Store]
genus = [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
species = [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
scientificName = [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
attributeName = [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
attributeDescription = [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
attributeUnit = [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
attribute = [ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]
fileID = https://cn.dataone.org/cn/v2/resolve/peggym.130.4
geohash_1 = [9]
geohash_2 = [9k]
geohash_3 = [9kd]
geohash_4 = [9kd7]
geohash_5 = [9kd7y]
geohash_6 = [9kd7ym]
geohash_7 = [9kd7ym0]
geohash_8 = [9kd7ym0h]
geohash_9 = [9kd7ym0hc]
_version_ = 1649968693944254464
serviceCoupling = false
==========================================================
id = tao.12930.1
identifier = tao.12930.1
formatId = eml://ecoinformatics.org/eml-2.1.0
formatType = METADATA
size = 68457
checksum = bda6ad5bc761f1f9824ea38b249abde5fc721283
submitter = dataone_integration_test_user
checksumAlgorithm = SHA-1
rightsHolder = dataone_integration_test_user
replicationAllowed = true
dateUploaded = Wed Aug 31 15:59:47 ART 2011
updateDate = Wed Aug 31 15:59:47 ART 2011
dateModified = Wed Aug 31 15:59:47 ART 2011
datasource = test_documents
authoritativeMN = test_documents
readPermission = [dataone_public_user]
writePermission = [dataone_integration_test_user]
dataUrl = https://cn.dataone.org/cn/v2/resolve/tao.12930.1
isService = false
abstract = 
title = test again
author = tao
authorSurName = tao
authorSurNameSort = tao
authorLastName = [tao]
investigator = [tao]
origin = [tao]
fileID = https://cn.dataone.org/cn/v2/resolve/tao.12930.1
_version_ = 1649968693965225984
serviceCoupling = false
==========================================================
[ INFO] 2019-11-12 04:26:06,529 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:06,591 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:06,602 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:06,612 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:06,612 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:06,612 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:06,613 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:06,613 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:06,616 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,617 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5c692def
[ INFO] 2019-11-12 04:26:06,617 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:06,624 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,624 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@4b21a13
[ INFO] 2019-11-12 04:26:06,625 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:06,625 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:06,640 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,641 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6de990
[ INFO] 2019-11-12 04:26:06,641 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:06,643 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,643 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@79ac7483
[ INFO] 2019-11-12 04:26:06,643 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:06,644 [TEST-SolrTokenenizerTest.testTokenizingComma-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:06,846 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:06,904 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:06,912 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:06,920 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:06,921 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:06,921 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:06,921 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:06,922 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:06,925 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,925 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@613d4be
[ INFO] 2019-11-12 04:26:06,925 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:06,932 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,933 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1884b5a0
[ INFO] 2019-11-12 04:26:06,933 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:06,933 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:06,949 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,951 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@687fa7fd
[ INFO] 2019-11-12 04:26:06,953 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:06,956 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:06,958 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3b455dd3
[ INFO] 2019-11-12 04:26:06,960 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:06,961 [TEST-SolrTokenenizerTest.testTokenizingPipe-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:07,192 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:07,277 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:07,291 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:07,300 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:07,300 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:07,300 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:07,300 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:07,301 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:07,304 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,304 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@349d5587
[ INFO] 2019-11-12 04:26:07,304 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:07,319 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,320 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2fc8f484
[ INFO] 2019-11-12 04:26:07,320 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:07,320 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:07,344 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,344 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6d986fd2
[ INFO] 2019-11-12 04:26:07,345 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:07,346 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,346 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3049b174
[ INFO] 2019-11-12 04:26:07,347 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:07,347 [TEST-SolrTokenenizerTest.testTokenizingHyphen-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:07,594 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:07,654 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:07,662 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:07,671 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:07,671 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:07,671 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:07,671 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:07,672 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:07,674 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,675 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@134ef55e
[ INFO] 2019-11-12 04:26:07,675 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:07,682 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,682 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@47f8efad
[ INFO] 2019-11-12 04:26:07,683 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:07,683 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:07,699 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,699 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@65fa94e6
[ INFO] 2019-11-12 04:26:07,699 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:07,701 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:07,701 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1b9d05b2
[ INFO] 2019-11-12 04:26:07,701 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:07,702 [TEST-SolrTokenenizerTest.testWildcardSerach-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:07,946 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:08,020 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:08,032 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:08,043 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:08,043 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:08,043 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=4}
[ WARN] 2019-11-12 04:26:08,044 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:08,044 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:08,047 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,047 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@24eb78f2
[ INFO] 2019-11-12 04:26:08,048 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:08,055 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,055 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3e4571ad
[ INFO] 2019-11-12 04:26:08,056 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:08,056 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:08,072 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,073 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1198b8da
[ INFO] 2019-11-12 04:26:08,073 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: tao.12930.1
[ INFO] 2019-11-12 04:26:08,074 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,075 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@61d92654
[ INFO] 2019-11-12 04:26:08,075 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:08,076 [TEST-SolrTokenenizerTest.testTokenizingContractionPreserved-seed#[760B850B8CF8E35C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.683 s - in org.dataone.cn.index.SolrTokenenizerTest
[INFO] Running org.dataone.cn.index.SolrFieldXPathDryad31Test
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.dataone.cn.index.SolrFieldXPathDryad31Test
[INFO] Running org.dataone.cn.index.SolrSearchIndexQueryTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrSearchIndexQueryTest_1E44BF6E649CA10F-001/init-core-data-001
[ INFO] 2019-11-12 04:26:08,330 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:08,389 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:08,411 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:26:08,577 [coreLoadExecutor-45-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:26:08,578 [coreLoadExecutor-45-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:26:08,591 [coreLoadExecutor-45-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:26:08,597 [coreLoadExecutor-45-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ WARN] 2019-11-12 04:26:08,607 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:08,608 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:08,608 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=2}
[ WARN] 2019-11-12 04:26:08,609 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:08,609 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:08,612 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,612 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@9d6b43e
[ INFO] 2019-11-12 04:26:08,612 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:08,620 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,620 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3c71e173
[ INFO] 2019-11-12 04:26:08,621 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:08,621 [TEST-SolrSearchIndexQueryTest.testQueryForIdInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:08,827 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:08,884 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:08,891 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[ WARN] 2019-11-12 04:26:08,904 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:130) Request URI: null
[ WARN] 2019-11-12 04:26:08,904 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:131) Response size: 1
[ WARN] 2019-11-12 04:26:08,905 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:132) First response result: responseHeader = {status=0,QTime=3}
[ WARN] 2019-11-12 04:26:08,905 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:133) Response status code: 0
[ WARN] 2019-11-12 04:26:08,905 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.index.DataONESolrJettyTestBase:sendSolrDeleteAll:134) Deleted All...
[ INFO] 2019-11-12 04:26:08,912 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,912 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5694df02
[ INFO] 2019-11-12 04:26:08,913 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:08,921 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:08,921 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@b065b6d
[ INFO] 2019-11-12 04:26:08,922 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:08,922 [TEST-SolrSearchIndexQueryTest.testQueryForWordInAbstractInFullTextField-seed#[1E44BF6E649CA10F]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.833 s - in org.dataone.cn.index.SolrSearchIndexQueryTest
[INFO] Running org.dataone.cn.index.SolrIndexBatchAddTest
Creating dataDir: /tmp/org.dataone.cn.index.SolrIndexBatchAddTest_31258F6BAA91D8E8-001/init-core-data-001
[ INFO] 2019-11-12 04:26:09,217 [TEST-SolrIndexBatchAddTest.testBatchAddCorrect-seed#[31258F6BAA91D8E8]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:09,307 [TEST-SolrIndexBatchAddTest.testBatchAddCorrect-seed#[31258F6BAA91D8E8]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:09,327 [TEST-SolrIndexBatchAddTest.testBatchAddCorrect-seed#[31258F6BAA91D8E8]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:26:09,475 [coreLoadExecutor-55-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:26:09,476 [coreLoadExecutor-55-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:26:09,488 [coreLoadExecutor-55-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:26:09,494 [coreLoadExecutor-55-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ INFO] 2019-11-12 04:26:09,680 [TEST-SolrIndexBatchAddTest.testBatchAddRuntime-seed#[31258F6BAA91D8E8]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:09,738 [TEST-SolrIndexBatchAddTest.testBatchAddRuntime-seed#[31258F6BAA91D8E8]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:09,745 [TEST-SolrIndexBatchAddTest.testBatchAddRuntime-seed#[31258F6BAA91D8E8]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.798 s - in org.dataone.cn.index.SolrIndexBatchAddTest
[INFO] Running org.dataone.cn.index.SolrFieldDataCiteTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.dataone.cn.index.SolrFieldDataCiteTest
[INFO] Running org.dataone.cn.index.SolrFieldXPathEmlTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.023 s - in org.dataone.cn.index.SolrFieldXPathEmlTest
[INFO] Running org.dataone.cn.index.HazelcastClientFactoryTest
ERROR IN SolrLogFormatter! original message:Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:35)
	at com.hazelcast.config.ClasspathXmlConfig.<init>(ClasspathXmlConfig.java:30)
	at org.dataone.cn.hazelcast.HazelcastClientFactory.getHazelcastClientUsingConfig(HazelcastClientFactory.java:145)
	at org.dataone.cn.hazelcast.HazelcastClientFactory.getStorageClient(HazelcastClientFactory.java:93)
	at org.dataone.cn.hazelcast.HazelcastClientFactory.getSystemMetadataMap(HazelcastClientFactory.java:71)
	at org.dataone.cn.index.HazelcastClientFactoryTest.testSystemMetadataMap(HazelcastClientFactoryTest.java:66)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:272)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:236)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:386)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:323)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:143)
ERROR IN SolrLogFormatter! original message:HazelcastClient is STARTING
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.client.LifecycleServiceClientImpl.fireLifecycleEvent(LifecycleServiceClientImpl.java:83)
	at com.hazelcast.client.LifecycleServiceClientImpl$1.call(LifecycleServiceClientImpl.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] 5720 is accepting socket connection from /127.0.0.1:44614
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.nio.SocketAcceptor.log(SocketAcceptor.java:142)
	at com.hazelcast.nio.SocketAcceptor.log(SocketAcceptor.java:138)
	at com.hazelcast.nio.SocketAcceptor.access$000(SocketAcceptor.java:28)
	at com.hazelcast.nio.SocketAcceptor$1.run(SocketAcceptor.java:111)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
	at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] 5720 accepted socket connection from /127.0.0.1:44614
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.nio.ConnectionManager.log(ConnectionManager.java:475)
	at com.hazelcast.nio.ConnectionManager.assignSocketChannel(ConnectionManager.java:260)
	at com.hazelcast.nio.SocketAcceptor$1.run(SocketAcceptor.java:122)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
	at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] received auth from Connection [/127.0.0.1:44614 -> null] live=true, client=true, type=CLIENT, successfully authenticated
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.impl.ClientHandlerService$ClientAuthenticateHandler.processCall(ClientHandlerService.java:852)
	at com.hazelcast.impl.ClientHandlerService$ClientOperationHandler.handle(ClientHandlerService.java:1565)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:57)
	at com.hazelcast.impl.ClientRequestHandler$1.run(ClientRequestHandler.java:54)
	at com.hazelcast.impl.ClientRequestHandler.doRun(ClientRequestHandler.java:63)
	at com.hazelcast.impl.FallThroughRunnable.run(FallThroughRunnable.java:22)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
	at com.hazelcast.impl.ExecutorThreadFactory$1.run(ExecutorThreadFactory.java:38)
ERROR IN SolrLogFormatter! original message:HazelcastClient is CLIENT_CONNECTION_OPENING
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:46)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:38)
	at com.hazelcast.client.LifecycleServiceClientImpl.fireLifecycleEvent(LifecycleServiceClientImpl.java:83)
	at com.hazelcast.client.LifecycleServiceClientImpl$1.call(LifecycleServiceClientImpl.java:72)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ERROR IN SolrLogFormatter! original message:HazelcastClient is CLIENT_CONNECTION_OPENED
	Exception: java.lang.NullPointerException
ERROR IN SolrLogFormatter! original message:HazelcastClient is STARTED
	Exception: java.lang.NullPointerException
ERROR IN SolrLogFormatter! original message:[127.0.0.1]:5720 [DataONEBuildTest] Initializing cluster partition table first arrangement...
	Exception: java.lang.NullPointerException
	at org.apache.solr.SolrLogFormatter$Method.hashCode(SolrLogFormatter.java:63)
	at java.util.HashMap.hash(HashMap.java:339)
	at java.util.HashMap.get(HashMap.java:557)
	at org.apache.solr.SolrLogFormatter.getShortClassName(SolrLogFormatter.java:279)
	at org.apache.solr.SolrLogFormatter._format(SolrLogFormatter.java:165)
	at org.apache.solr.SolrLogFormatter.format(SolrLogFormatter.java:116)
	at java.util.logging.StreamHandler.publish(StreamHandler.java:211)
	at java.util.logging.ConsoleHandler.publish(ConsoleHandler.java:116)
	at java.util.logging.Logger.log(Logger.java:738)
	at com.hazelcast.logging.StandardLoggerFactory$StandardLogger.log(StandardLoggerFactory.java:50)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:146)
	at com.hazelcast.logging.LoggingServiceImpl$DefaultLogger.log(LoggingServiceImpl.java:130)
	at com.hazelcast.impl.PartitionManager.firstArrangement(PartitionManager.java:160)
	at com.hazelcast.impl.PartitionManager.getOwner(PartitionManager.java:145)
	at com.hazelcast.impl.PartitionServiceImpl$3.process(PartitionServiceImpl.java:143)
	at com.hazelcast.cluster.ClusterService.processProcessable(ClusterService.java:190)
	at com.hazelcast.cluster.ClusterService.dequeueProcessables(ClusterService.java:256)
	at com.hazelcast.cluster.ClusterService.run(ClusterService.java:201)
	at java.lang.Thread.run(Thread.java:748)
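The repeated "ERROR IN SolrLogFormatter" NullPointerExceptions above all bottom out in SolrLogFormatter$Method.hashCode being invoked by HashMap.get while formatting a log record that came from Hazelcast rather than Solr. A minimal sketch of that failure mode, with hypothetical class and field names (not Solr's actual source): a HashMap key type whose hashCode() dereferences a field that is null for log records arriving from a foreign framework.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for org.apache.solr.SolrLogFormatter$Method: a
// HashMap key whose hashCode() dereferences a field without a null check.
class MethodKey {
    final String className;   // may be null for records from foreign loggers
    final String methodName;

    MethodKey(String className, String methodName) {
        this.className = className;
        this.methodName = methodName;
    }

    @Override
    public int hashCode() {
        // NPE here when className is null -- the same shape as the traces:
        // Method.hashCode -> HashMap.hash -> HashMap.get
        return 31 * className.hashCode() + methodName.hashCode();
    }
}

public class FormatterNpeSketch {
    public static void main(String[] args) {
        Map<MethodKey, String> shortNames = new HashMap<>();
        String result;
        try {
            // HashMap.get() hashes the key before any lookup, so the NPE
            // fires inside the log formatter, not in the logging caller.
            shortNames.get(new MethodKey(null, "log"));
            result = "no exception";
        } catch (NullPointerException e) {
            result = "NullPointerException";
        }
        System.out.println(result);
    }
}
```

Because the exception is thrown inside the formatter, the original Hazelcast lifecycle messages (CLIENT_CONNECTION_OPENING, CLIENT_CONNECTION_OPENED, STARTED) are only preserved in the "original message" line of each error; the test itself still passes.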
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.115 s - in org.dataone.cn.index.HazelcastClientFactoryTest
[INFO] Running org.dataone.cn.index.SolrFieldPortalTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 s - in org.dataone.cn.index.SolrFieldPortalTest
[INFO] Running org.dataone.cn.index.SolrIndexReprocessTest
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.005 s - in org.dataone.cn.index.SolrIndexReprocessTest
[INFO] Running org.dataone.cn.index.TestResourceMapIndexTask
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.dataone.cn.index.TestResourceMapIndexTask
[INFO] Running org.dataone.cn.index.SolrFieldXPathFgdcTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.02 s - in org.dataone.cn.index.SolrFieldXPathFgdcTest
[INFO] Running org.dataone.cn.index.InvalidXmlCharTest
Hibernate: select indextask0_.id as id2_, indextask0_.dateSysMetaModified as dateSysM2_2_, indextask0_.deleted as deleted2_, indextask0_.formatId as formatId2_, indextask0_.nextExecution as nextExec5_2_, indextask0_.objectPath as objectPath2_, indextask0_.pid as pid2_, indextask0_.priority as priority2_, indextask0_.status as status2_, indextask0_.sysMetadata as sysMeta10_2_, indextask0_.taskModifiedDate as taskMod11_2_, indextask0_.tryCount as tryCount2_, indextask0_.version as version2_ from index_task indextask0_ where indextask0_.pid=?
 
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: text/plain
field value: DATA
field value: 684336
field value: 4504b4dd97f2d7a4766dfaaa3f968ec2
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: MD5
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: false
field value: 2012-03-07T17:26:09.962Z
field value: 2012-03-07T17:27:22.879Z
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: completed
field value: 2012-03-07T00:00:00.000Z
field value: public
field value: true
field value: https://cn.dataone.org/cn/v2/resolve/testMNodeTier3%3A2012679267486_common-bmp-doc-example-%E0%B8%89%E0%B8%B1%E0%B8%99%E0%B8%81%E0%B8%B4%E0%B8%99%E0%B8%81%E0%B8%A3%E0%B8%B0%E0%B8%88%E0%B8%81%E0%B9%84%E0%B8%94%E0%B9%89
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
Hibernate: select indextask0_.id as id2_, indextask0_.dateSysMetaModified as dateSysM2_2_, indextask0_.deleted as deleted2_, indextask0_.formatId as formatId2_, indextask0_.nextExecution as nextExec5_2_, indextask0_.objectPath as objectPath2_, indextask0_.pid as pid2_, indextask0_.priority as priority2_, indextask0_.status as status2_, indextask0_.sysMetadata as sysMeta10_2_, indextask0_.taskModifiedDate as taskMod11_2_, indextask0_.tryCount as tryCount2_, indextask0_.version as version2_ from index_task indextask0_ where indextask0_.pid=?
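The Hibernate statements above read and write rows of an index_task table keyed by pid. To make the column list explicit, here is a plain-Java sketch (no Hibernate on the classpath; this only rebuilds the logged parameterized INSERT from the column names seen in the log, where the generated id is bound as null and every other column is a ? placeholder).

```java
import java.util.List;
import java.util.stream.Collectors;

// Rebuild the parameterized INSERT that Hibernate logged for index_task,
// purely to document the table's column list in one place.
public class IndexTaskSql {
    static final List<String> COLUMNS = List.of(
            "id", "dateSysMetaModified", "deleted", "formatId", "nextExecution",
            "objectPath", "pid", "priority", "status", "sysMetadata",
            "taskModifiedDate", "tryCount", "version");

    static String insertSql() {
        // id is database-generated (bound as null in the log); the rest are ?
        String placeholders = COLUMNS.stream()
                .map(c -> c.equals("id") ? "null" : "?")
                .collect(Collectors.joining(", "));
        return "insert into index_task (" + String.join(", ", COLUMNS)
                + ") values (" + placeholders + ")";
    }

    public static void main(String[] args) {
        System.out.println(insertSql());
    }
}
```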
 
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
field value: text/plain
field value: DATA
field value: 684336
field value: 4504b4dd97f2d7a4766dfaaa3f968ec2
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: MD5
field value: CN=testRightsHolder,DC=dataone,DC=org
field value: false
field value: 2012-03-07T17:26:09.962Z
field value: 2012-03-07T17:27:22.879Z
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: urn:node:DEMO3
field value: completed
field value: 2012-03-07T00:00:00.000Z
field value: public
field value: true
field value: https://cn.dataone.org/cn/v2/resolve/testMNodeTier3%3A2012679267486_common-bmp-doc-example-%E0%B8%89%E0%B8%B1%E0%B8%99%E0%B8%81%E0%B8%B4%E0%B8%99%E0%B8%81%E0%B8%A3%E0%B8%B0%E0%B8%88%E0%B8%81%E0%B9%84%E0%B8%94%E0%B9%89
 
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.395 s - in org.dataone.cn.index.InvalidXmlCharTest
[INFO] Running org.dataone.cn.index.SolrFieldIsotc211Test
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.203 s - in org.dataone.cn.index.SolrFieldIsotc211Test
[INFO] Running org.dataone.cn.indexer.annotation.EmlAnnotationSubprocessorTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.077 s - in org.dataone.cn.indexer.annotation.EmlAnnotationSubprocessorTest
[INFO] Running org.dataone.cn.indexer.annotation.AnnotatorSubprocessorTest
[ INFO] 2019-11-12 04:26:12,824 [main]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:12,887 [main]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.276 s - in org.dataone.cn.indexer.annotation.AnnotatorSubprocessorTest
[INFO] Running org.dataone.cn.indexer.annotation.SolrIndexAnnotatorTest
Creating dataDir: /tmp/org.dataone.cn.indexer.annotation.SolrIndexAnnotatorTest_C350585D572252C-001/init-core-data-001
[ INFO] 2019-11-12 04:26:13,110 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:13,169 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:13,192 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:26:13,327 [coreLoadExecutor-65-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:26:13,327 [coreLoadExecutor-65-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:26:13,338 [coreLoadExecutor-65-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:26:13,343 [coreLoadExecutor-65-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ INFO] 2019-11-12 04:26:13,348 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:13,349 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@723fc1ec
[ INFO] 2019-11-12 04:26:13,349 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: peggym.130.4
[ INFO] 2019-11-12 04:26:13,357 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:13,358 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1eac5062
[ INFO] 2019-11-12 04:26:13,358 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:13,359 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Comparing value for field abstract
Doc Value:  This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
Solr Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.

Comparing value for field keywords
Doc Value:  [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
Solr Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]

Comparing value for field title
Doc Value:  Augrabies falls National Park census data.
Solr Value: Augrabies falls National Park census data.

Comparing value for field southBoundCoord
Doc Value:  26.0
Solr Value: 26.0

Comparing value for field northBoundCoord
Doc Value:  26.0
Solr Value: 26.0

Comparing value for field westBoundCoord
Doc Value:  -120.31121
Solr Value: -120.31121

Comparing value for field eastBoundCoord
Doc Value:  -120.31121
Solr Value: -120.31121

Comparing value for field site
Doc Value:  [Agulhas falls national Park]
Solr Value: [Agulhas falls national Park]

Comparing value for field beginDate
Doc Value:  Thu Jan 01 00:00:00 CET 1998
Solr Value: Thu Jan 01 00:00:00 CET 1998

Comparing value for field endDate
Doc Value:  Fri Feb 13 00:00:00 CET 2004
Solr Value: Fri Feb 13 00:00:00 CET 2004

Comparing value for field author
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorSurName
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorSurNameSort
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorLastName
Doc Value:  [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]

Comparing value for field investigator
Doc Value:  [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]

Comparing value for field origin
Doc Value:  [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
Solr Value: [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]

Comparing value for field contactOrganization
Doc Value:  [SANParks, The Awesome Store]
Solr Value: [SANParks, The Awesome Store]

Comparing value for field genus
Doc Value:  [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
Solr Value: [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]

Comparing value for field species
Doc Value:  [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
Solr Value: [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]

Comparing value for field scientificName
Doc Value:  [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
Solr Value: [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]

Comparing value for field attributeName
Doc Value:  [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
Solr Value: [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]

Comparing value for field attributeDescription
Doc Value:  [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
Solr Value: [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]

Comparing value for field attributeUnit
Doc Value:  [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
Solr Value: [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]

Comparing value for field attribute
Doc Value:  [ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]
Solr Value: [ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]

Comparing value for field fileID
Doc Value:  https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4

Comparing value for field text
Doc Value:  Augrabies falls National Park census data.   SANParks    Garcia  Freddy   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa     Freeman  Gordon   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    The Awesome Store  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.   SANParks, South Africa  Augrabies Falls National Park,South Africa  Census data  EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature    Agulhas falls national Park   -120.311210  -120.311210  26.0  26.0       1998    2004-02-13       genus  Antidorcas   species  marsupialis  Hartmans Zebra     Genus  Cercopithecus   Species  aethiops  Vervet monkey     Genus  Diceros   Species  bicornis  Baboon     Genus  Equus   Species  hartmannae  Giraffe     Genus  Giraffa   Species  camelopardalis  Kudu     Genus  Oreotragus   Species  oreotragus  Gemsbok     Genus  Oryz   Species  gazella  Eland     Genus  Papio   Species  hamadryas     Genus  Taurotragus   Species  oryx  Black rhino     Genus  Tragelaphus   Species  strepsiceros  Klipspringer      1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
Solr Value: Augrabies falls National Park census data.   SANParks    Garcia  Freddy   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa     Freeman  Gordon   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    The Awesome Store  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.   SANParks, South Africa  Augrabies Falls National Park,South Africa  Census data  EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature    Agulhas falls national Park   -120.311210  -120.311210  26.0  26.0       1998    2004-02-13       genus  Antidorcas   species  marsupialis  Hartmans Zebra     Genus  Cercopithecus   Species  aethiops  Vervet monkey     Genus  Diceros   Species  bicornis  Baboon     Genus  Equus   Species  hartmannae  Giraffe     Genus  Giraffa   Species  camelopardalis  Kudu     Genus  Oreotragus   Species  oreotragus  Gemsbok     Genus  Oryz   Species  gazella  Eland     Genus  Papio   Species  hamadryas     Genus  Taurotragus   Species  oryx  Black rhino     Genus  Tragelaphus   Species  strepsiceros  Klipspringer      1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless

Comparing value for field geohash_1
Doc Value:  [9]
Solr Value: [9]

Comparing value for field geohash_2
Doc Value:  [9k]
Solr Value: [9k]

Comparing value for field geohash_3
Doc Value:  [9kd]
Solr Value: [9kd]

Comparing value for field geohash_4
Doc Value:  [9kd7]
Solr Value: [9kd7]

Comparing value for field geohash_5
Doc Value:  [9kd7y]
Solr Value: [9kd7y]

Comparing value for field geohash_6
Doc Value:  [9kd7ym]
Solr Value: [9kd7ym]

Comparing value for field geohash_7
Doc Value:  [9kd7ym0]
Solr Value: [9kd7ym0]

Comparing value for field geohash_8
Doc Value:  [9kd7ym0h]
Solr Value: [9kd7ym0h]

Comparing value for field geohash_9
Doc Value:  [9kd7ym0hc]
Solr Value: [9kd7ym0hc]

Comparing value for field isService
Doc Value:  false
Solr Value: false

Comparing value for field id
Doc Value:  peggym.130.4
Solr Value: peggym.130.4

Comparing value for field identifier
Doc Value:  peggym.130.4
Solr Value: peggym.130.4

Comparing value for field seriesId
Doc Value:  peggym.130
Solr Value: peggym.130

Comparing value for field formatId
Doc Value:  eml://ecoinformatics.org/eml-2.1.0
Solr Value: eml://ecoinformatics.org/eml-2.1.0

Comparing value for field formatType
Doc Value:  METADATA
Solr Value: METADATA

Comparing value for field size
Doc Value:  36281
Solr Value: 36281

Comparing value for field checksum
Doc Value:  24426711d5385a9ffa583a13d07af2502884932f
Solr Value: 24426711d5385a9ffa583a13d07af2502884932f

Comparing value for field submitter
Doc Value:  dataone_integration_test_user
Solr Value: dataone_integration_test_user

Comparing value for field checksumAlgorithm
Doc Value:  SHA-1
Solr Value: SHA-1

Comparing value for field rightsHolder
Doc Value:  dataone_integration_test_user
Solr Value: dataone_integration_test_user

Comparing value for field replicationAllowed
Doc Value:  true
Solr Value: true

Comparing value for field obsoletes
Doc Value:  peggym.130.3
Solr Value: peggym.130.3

Comparing value for field dateUploaded
Doc Value:  Wed Aug 31 15:59:50 CEST 2011
Solr Value: Wed Aug 31 15:59:50 CEST 2011

Comparing value for field dateModified
Doc Value:  Wed Aug 31 15:59:50 CEST 2011
Solr Value: Wed Aug 31 15:59:50 CEST 2011

Comparing value for field datasource
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field authoritativeMN
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field readPermission
Doc Value:  [public, dataone_public_user, dataone_test_user]
Solr Value: [public, dataone_public_user, dataone_test_user]

Comparing value for field writePermission
Doc Value:  [dataone_integration_test_user]
Solr Value: [dataone_integration_test_user]

Comparing value for field isPublic
Doc Value:  true
Solr Value: true

Comparing value for field dataUrl
Doc Value:  https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4
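The geohash_1 through geohash_9 values compared above are successive prefixes of a single 9-character geohash computed from the record's point coordinate (here the degenerate bounding box southBoundCoord = northBoundCoord = 26.0, westBoundCoord = eastBoundCoord = -120.31121). A minimal encoder sketch using the standard geohash algorithm (not DataONE's actual indexer code) reproduces the logged value:

```java
// Standard geohash encoding: interleave longitude/latitude range-halving
// bits (longitude first) and emit one base-32 character per 5 bits.
public class GeohashSketch {
    private static final String BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz";

    static String encode(double lat, double lon, int precision) {
        double latMin = -90, latMax = 90, lonMin = -180, lonMax = 180;
        StringBuilder hash = new StringBuilder();
        boolean evenBit = true; // bits alternate, starting with longitude
        int bit = 0, ch = 0;
        while (hash.length() < precision) {
            if (evenBit) {
                double mid = (lonMin + lonMax) / 2;
                if (lon >= mid) { ch = (ch << 1) | 1; lonMin = mid; }
                else            { ch = ch << 1;       lonMax = mid; }
            } else {
                double mid = (latMin + latMax) / 2;
                if (lat >= mid) { ch = (ch << 1) | 1; latMin = mid; }
                else            { ch = ch << 1;       latMax = mid; }
            }
            evenBit = !evenBit;
            if (++bit == 5) { // every 5 bits becomes one base-32 character
                hash.append(BASE32.charAt(ch));
                bit = 0;
                ch = 0;
            }
        }
        return hash.toString();
    }

    public static void main(String[] args) {
        // south/northBoundCoord = 26.0, west/eastBoundCoord = -120.31121
        System.out.println(encode(26.0, -120.31121, 9));
    }
}
```

The logged geohash_9 value is [9kd7ym0hc]; the geohash_1..geohash_8 fields are exactly its 1- through 8-character prefixes, which is why each shorter field denotes a coarser cell containing the same point.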

[ INFO] 2019-11-12 04:26:13,389 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:13,390 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@4ee76a13
[ INFO] 2019-11-12 04:26:13,390 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: annotation.130.4
[ INFO] 2019-11-12 04:26:13,403 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 2
[ INFO] 2019-11-12 04:26:13,403 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@362f967a
[ INFO] 2019-11-12 04:26:13,404 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: annotation.130.4
[ INFO] 2019-11-12 04:26:13,405 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: peggym.130.4
[ INFO] 2019-11-12 04:26:13,409 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = peggym.130.4
[ INFO] 2019-11-12 04:26:13,409 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968701361881088
[ INFO] 2019-11-12 04:26:13,410 [TEST-SolrIndexAnnotatorTest.testSystemMetadataEml210AndAnnotation-seed#[C350585D572252C]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
FIELD NAME=id, VALUE=peggym.130.4
FIELD NAME=identifier, VALUE=peggym.130.4
FIELD NAME=seriesId, VALUE=peggym.130
FIELD NAME=formatId, VALUE=eml://ecoinformatics.org/eml-2.1.0
FIELD NAME=formatType, VALUE=METADATA
FIELD NAME=size, VALUE=36281
FIELD NAME=checksum, VALUE=24426711d5385a9ffa583a13d07af2502884932f
FIELD NAME=submitter, VALUE=dataone_integration_test_user
FIELD NAME=checksumAlgorithm, VALUE=SHA-1
FIELD NAME=rightsHolder, VALUE=dataone_integration_test_user
FIELD NAME=replicationAllowed, VALUE=true
FIELD NAME=obsoletes, VALUE=peggym.130.3
FIELD NAME=dateUploaded, VALUE=Wed Aug 31 15:59:50 CEST 2011
FIELD NAME=dateModified, VALUE=Wed Aug 31 15:59:50 CEST 2011
FIELD NAME=datasource, VALUE=test_documents
FIELD NAME=authoritativeMN, VALUE=test_documents
FIELD NAME=readPermission, VALUE=[public, dataone_public_user, dataone_test_user]
FIELD NAME=writePermission, VALUE=[dataone_integration_test_user]
FIELD NAME=isPublic, VALUE=true
FIELD NAME=dataUrl, VALUE=https://cn.dataone.org/cn/v2/resolve/peggym.130.4
FIELD NAME=isService, VALUE=false
FIELD NAME=abstract, VALUE=This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
FIELD NAME=keywords, VALUE=[SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
FIELD NAME=title, VALUE=Augrabies falls National Park census data.
FIELD NAME=southBoundCoord, VALUE=26.0
FIELD NAME=northBoundCoord, VALUE=26.0
FIELD NAME=westBoundCoord, VALUE=-120.31121
FIELD NAME=eastBoundCoord, VALUE=-120.31121
FIELD NAME=site, VALUE=[Agulhas falls national Park]
FIELD NAME=beginDate, VALUE=Thu Jan 01 00:00:00 CET 1998
FIELD NAME=endDate, VALUE=Fri Feb 13 00:00:00 CET 2004
FIELD NAME=author, VALUE=SANParks
FIELD NAME=authorSurName, VALUE=SANParks
FIELD NAME=authorSurNameSort, VALUE=SANParks
FIELD NAME=authorLastName, VALUE=[SANParks, Garcia, Freeman]
FIELD NAME=investigator, VALUE=[SANParks, Garcia, Freeman]
FIELD NAME=origin, VALUE=[SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
FIELD NAME=contactOrganization, VALUE=[SANParks, The Awesome Store]
FIELD NAME=genus, VALUE=[Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
FIELD NAME=species, VALUE=[marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
FIELD NAME=scientificName, VALUE=[Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
FIELD NAME=attributeName, VALUE=[ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
FIELD NAME=attributeDescription, VALUE=[The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
FIELD NAME=attributeUnit, VALUE=[dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
FIELD NAME=attribute, VALUE=[ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]
FIELD NAME=fileID, VALUE=https://cn.dataone.org/cn/v2/resolve/peggym.130.4
FIELD NAME=text, VALUE=Augrabies falls National Park census data.   SANParks    Garcia  Freddy   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa     Freeman  Gordon   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    The Awesome Store  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.   SANParks, South Africa  Augrabies Falls National Park,South Africa  Census data  EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature    Agulhas falls national Park   -120.311210  -120.311210  26.0  26.0       1998    2004-02-13       genus  Antidorcas   species  marsupialis  Hartmans Zebra     Genus  Cercopithecus   Species  aethiops  Vervet monkey     Genus  Diceros   Species  bicornis  Baboon     Genus  Equus   Species  hartmannae  Giraffe     Genus  Giraffa   Species  camelopardalis  Kudu     Genus  Oreotragus   Species  oreotragus  Gemsbok     Genus  Oryz   Species  gazella  Eland     Genus  Papio   Species  hamadryas     Genus  Taurotragus   Species  oryx  Black rhino     Genus  Tragelaphus   Species  strepsiceros  Klipspringer      1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
FIELD NAME=geohash_1, VALUE=[9]
FIELD NAME=geohash_2, VALUE=[9k]
FIELD NAME=geohash_3, VALUE=[9kd]
FIELD NAME=geohash_4, VALUE=[9kd7]
FIELD NAME=geohash_5, VALUE=[9kd7y]
FIELD NAME=geohash_6, VALUE=[9kd7ym]
FIELD NAME=geohash_7, VALUE=[9kd7ym0]
FIELD NAME=geohash_8, VALUE=[9kd7ym0h]
FIELD NAME=geohash_9, VALUE=[9kd7ym0hc]
FIELD NAME=_version_, VALUE=1649968701420601344
FIELD NAME=sem_annotation, VALUE=[http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass]
FIELD NAME=sem_annotated_by, VALUE=[annotation.130.4]
FIELD NAME=serviceCoupling, VALUE=false
annotationValue: http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass
Comparing value for field abstract
Doc Value:  This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.
Solr Value: This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.

Comparing value for field keywords
Doc Value:  [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]
Solr Value: [SANParks, South Africa, Augrabies Falls National Park,South Africa, Census data, EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature]

Comparing value for field title
Doc Value:  Augrabies falls National Park census data.
Solr Value: Augrabies falls National Park census data.

Comparing value for field southBoundCoord
Doc Value:  26.0
Solr Value: 26.0

Comparing value for field northBoundCoord
Doc Value:  26.0
Solr Value: 26.0

Comparing value for field westBoundCoord
Doc Value:  -120.31121
Solr Value: -120.31121

Comparing value for field eastBoundCoord
Doc Value:  -120.31121
Solr Value: -120.31121

Comparing value for field site
Doc Value:  [Agulhas falls national Park]
Solr Value: [Agulhas falls national Park]

Comparing value for field beginDate
Doc Value:  Thu Jan 01 00:00:00 CET 1998
Solr Value: Thu Jan 01 00:00:00 CET 1998

Comparing value for field endDate
Doc Value:  Fri Feb 13 00:00:00 CET 2004
Solr Value: Fri Feb 13 00:00:00 CET 2004

Comparing value for field author
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorSurName
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorSurNameSort
Doc Value:  SANParks
Solr Value: SANParks

Comparing value for field authorLastName
Doc Value:  [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]

Comparing value for field investigator
Doc Value:  [SANParks, Garcia, Freeman]
Solr Value: [SANParks, Garcia, Freeman]

Comparing value for field origin
Doc Value:  [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]
Solr Value: [SANParks Freddy Garcia, Gordon Freeman, The Awesome Store]

Comparing value for field contactOrganization
Doc Value:  [SANParks, The Awesome Store]
Solr Value: [SANParks, The Awesome Store]

Comparing value for field genus
Doc Value:  [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]
Solr Value: [Antidorcas, Cercopithecus, Diceros, Equus, Giraffa, Oreotragus, Oryz, Papio, Taurotragus, Tragelaphus]

Comparing value for field species
Doc Value:  [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]
Solr Value: [marsupialis, aethiops, bicornis, hartmannae, camelopardalis, oreotragus, gazella, hamadryas, oryx, strepsiceros]

Comparing value for field scientificName
Doc Value:  [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]
Solr Value: [Antidorcas marsupialis, Cercopithecus aethiops, Diceros bicornis, Equus hartmannae, Giraffa camelopardalis, Oreotragus oreotragus, Oryz gazella, Papio hamadryas, Taurotragus oryx, Tragelaphus strepsiceros]

Comparing value for field attributeName
Doc Value:  [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]
Solr Value: [ID, Lat S, Long E, Date, Stratum, Transect, Species, LatS, LongE, Total, Juvenile, L/R, Species, Stratum, Date, SumOfTotal, SumOfJuvenile, Species, Date, SumOfTotal, SumOfJuvenile]

Comparing value for field attributeDescription
Doc Value:  [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]
Solr Value: [The ID, Lat S, Long E, The date, Stratum, Transect, The name of species, LatS, LongE, The total, Juvenile, L/R, The name of species, Stratum, The date, Sum of the total, Sum of juvenile, The name of species, The date, The sum of total, Sum of juvenile]

Comparing value for field attributeUnit
Doc Value:  [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]
Solr Value: [dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless, dimensionless]

Comparing value for field attribute
Doc Value:  [ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]
Solr Value: [ID  The ID dimensionless, Lat S  Lat S dimensionless, Long E  Long E dimensionless, Date  The date, Stratum  Stratum dimensionless, Transect  Transect dimensionless, Species  The name of species, LatS  LatS dimensionless, LongE  LongE dimensionless, Total  The total dimensionless, Juvenile  Juvenile dimensionless, L/R  L/R dimensionless, Species  The name of species, Stratum  Stratum dimensionless, Date  The date, SumOfTotal  Sum of the total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless, Species  The name of species, Date  The date, SumOfTotal  The sum of total dimensionless, SumOfJuvenile  Sum of juvenile dimensionless]

Comparing value for field fileID
Doc Value:  https://cn.dataone.org/cn/v2/resolve/peggym.130.4
Solr Value: https://cn.dataone.org/cn/v2/resolve/peggym.130.4

Comparing value for field text
Doc Value:  Augrabies falls National Park census data.   SANParks    Garcia  Freddy   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa     Freeman  Gordon   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    The Awesome Store  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.   SANParks, South Africa  Augrabies Falls National Park,South Africa  Census data  EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature    Agulhas falls national Park   -120.311210  -120.311210  26.0  26.0       1998    2004-02-13       genus  Antidorcas   species  marsupialis  Hartmans Zebra     Genus  Cercopithecus   Species  aethiops  Vervet monkey     Genus  Diceros   Species  bicornis  Baboon     Genus  Equus   Species  hartmannae  Giraffe     Genus  Giraffa   Species  camelopardalis  Kudu     Genus  Oreotragus   Species  oreotragus  Gemsbok     Genus  Oryz   Species  gazella  Eland     Genus  Papio   Species  hamadryas     Genus  Taurotragus   Species  oryx  Black rhino     Genus  Tragelaphus   Species  strepsiceros  Klipspringer      1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless
Solr Value: Augrabies falls National Park census data.   SANParks    Garcia  Freddy   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa     Freeman  Gordon   SANParks  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    The Awesome Store  Regional Ecologists   Private Bag x402 Skukuza, 1350 South Africa    This metadata record fred, describes a 12-34 TT-12 long-term data document can't frank.  This is a test.  If this was not a lower, an abstract "double" or 'single' would be present in UPPER (parenthized) this location.   SANParks, South Africa  Augrabies Falls National Park,South Africa  Census data  EARTH SCIENCE : Oceans : Ocean Temperature : Water Temperature    Agulhas falls national Park   -120.311210  -120.311210  26.0  26.0       1998    2004-02-13       genus  Antidorcas   species  marsupialis  Hartmans Zebra     Genus  Cercopithecus   Species  aethiops  Vervet monkey     Genus  Diceros   Species  bicornis  Baboon     Genus  Equus   Species  hartmannae  Giraffe     Genus  Giraffa   Species  camelopardalis  Kudu     Genus  Oreotragus   Species  oreotragus  Gemsbok     Genus  Oryz   Species  gazella  Eland     Genus  Papio   Species  hamadryas     Genus  Taurotragus   Species  oryx  Black rhino     Genus  Tragelaphus   Species  strepsiceros  Klipspringer      1251095992100 peggym.130.4 ID Lat S Long E Date Stratum Transect Species LatS LongE Total Juvenile L/R SumOfTotal SumOfJuvenile The ID Lat S Long E The date Stratum Transect The name of species LatS LongE The total Juvenile L/R Sum of the total Sum of juvenile The sum of total dimensionless

Comparing value for field geohash_1
Doc Value:  [9]
Solr Value: [9]

Comparing value for field geohash_2
Doc Value:  [9k]
Solr Value: [9k]

Comparing value for field geohash_3
Doc Value:  [9kd]
Solr Value: [9kd]

Comparing value for field geohash_4
Doc Value:  [9kd7]
Solr Value: [9kd7]

Comparing value for field geohash_5
Doc Value:  [9kd7y]
Solr Value: [9kd7y]

Comparing value for field geohash_6
Doc Value:  [9kd7ym]
Solr Value: [9kd7ym]

Comparing value for field geohash_7
Doc Value:  [9kd7ym0]
Solr Value: [9kd7ym0]

Comparing value for field geohash_8
Doc Value:  [9kd7ym0h]
Solr Value: [9kd7ym0h]

Comparing value for field geohash_9
Doc Value:  [9kd7ym0hc]
Solr Value: [9kd7ym0hc]

Comparing value for field isService
Doc Value:  false
Solr Value: false

[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.555 s - in org.dataone.cn.indexer.annotation.SolrIndexAnnotatorTest
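The `geohash_1` through `geohash_9` values verified above are nested prefixes of a single geohash computed from the record's bounding-box center (lat 26.0, lon -120.31121). A minimal sketch of standard geohash encoding (base32 alphabet, longitude bit first) reproduces them; this is illustrative only, not the indexer's actual code:

```python
# Minimal geohash encoder; derives the nested geohash_1..geohash_9
# prefixes seen in the log from one point. Standard algorithm:
# interleave longitude/latitude bisection bits, 5 bits per base32 char.
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash(lat, lon, precision=9):
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits, bit_count, even, chars = 0, 0, True, []
    while len(chars) < precision:
        if even:  # even bit index: refine longitude
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits, lon_lo = (bits << 1) | 1, mid
            else:
                bits, lon_hi = bits << 1, mid
        else:     # odd bit index: refine latitude
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits, lat_lo = (bits << 1) | 1, mid
            else:
                bits, lat_hi = bits << 1, mid
        even = not even
        bit_count += 1
        if bit_count == 5:  # 5 bits -> one base32 character
            chars.append(BASE32[bits])
            bits, bit_count = 0, 0
    return "".join(chars)

full = geohash(26.0, -120.31121, 9)
fields = {f"geohash_{i}": full[:i] for i in range(1, 10)}
print(fields["geohash_9"])  # 9kd7ym0hc, matching the indexed value
```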
[INFO] Running org.dataone.cn.indexer.annotation.SolrIndexEmlAnnotationTest
Creating dataDir: /tmp/org.dataone.cn.indexer.annotation.SolrIndexEmlAnnotationTest_5E1DD9B9144ADEEE-001/init-core-data-001
[ INFO] 2019-11-12 04:26:13,680 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:13,752 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:13,776 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:26:14,002 [coreLoadExecutor-75-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:26:14,003 [coreLoadExecutor-75-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:26:14,013 [coreLoadExecutor-75-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:26:14,019 [coreLoadExecutor-75-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ INFO] 2019-11-12 04:26:14,022 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:14,023 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@40a866a
[ INFO] 2019-11-12 04:26:14,024 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: eml-test-doc
[ INFO] 2019-11-12 04:26:14,025 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:14,026 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@5e0bce50
[ INFO] 2019-11-12 04:26:14,026 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:14,034 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:14,034 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@7a912524
[ INFO] 2019-11-12 04:26:14,035 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:14,035 [TEST-SolrIndexEmlAnnotationTest.testSystemMetadataEml220AndAnnotation-seed#[5E1DD9B9144ADEEE]]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Comparing value for field abstract
Doc Value:  
Solr Value: 

Comparing value for field title
Doc Value:  EML Annotation Example
Solr Value: EML Annotation Example

Comparing value for field project
Doc Value:  [MY PROJECT]
Solr Value: [MY PROJECT]

Comparing value for field funding
Doc Value:  [SOME_RANDOM_FUNDING_INFO]
Solr Value: [SOME_RANDOM_FUNDING_INFO]

Comparing value for field funderName
Doc Value:  [My Funder]
Solr Value: [My Funder]

Comparing value for field funderIdentifier
Doc Value:  [MY_FUNDER]
Solr Value: [MY_FUNDER]

Comparing value for field awardNumber
Doc Value:  [AWARD1]
Solr Value: [AWARD1]

Comparing value for field awardTitle
Doc Value:  [An example award title]
Solr Value: [An example award title]

Comparing value for field author
Doc Value:  EML Annotator
Solr Value: EML Annotator

Comparing value for field authorGivenName
Doc Value:  EML
Solr Value: EML

Comparing value for field authorSurName
Doc Value:  Annotator
Solr Value: Annotator

Comparing value for field authorGivenNameSort
Doc Value:  EML
Solr Value: EML

Comparing value for field authorSurNameSort
Doc Value:  Annotator
Solr Value: Annotator

Comparing value for field authorLastName
Doc Value:  [Annotator]
Solr Value: [Annotator]

Comparing value for field investigator
Doc Value:  [Annotator]
Solr Value: [Annotator]

Comparing value for field origin
Doc Value:  [EML Annotator]
Solr Value: [EML Annotator]

Comparing value for field attributeName
Doc Value:  [SOME_ATTRIBUTE]
Solr Value: [SOME_ATTRIBUTE]

Comparing value for field attributeDescription
Doc Value:  [SOME_ATTRIBUTE's definition]
Solr Value: [SOME_ATTRIBUTE's definition]

Comparing value for field attribute
Doc Value:  [SOME_ATTRIBUTE  SOME_ATTRIBUTE's definition]
Solr Value: [SOME_ATTRIBUTE  SOME_ATTRIBUTE's definition]

Comparing value for field fileID
Doc Value:  https://cn.dataone.org/cn/v2/resolve/eml-test-doc
Solr Value: https://cn.dataone.org/cn/v2/resolve/eml-test-doc

Comparing value for field text
Doc Value:  EML Annotation Example   EML  Annotator     EML  Annotator    MY PROJECT    EML  Annotator   principalInvestigator   SOME_RANDOM_FUNDING_INFO   My Funder  MY_FUNDER  AWARD1  An example award title  https://example.org/someaward eml-test-doc SOME_ATTRIBUTE SOME_ATTRIBUTE's definition
Solr Value: EML Annotation Example   EML  Annotator     EML  Annotator    MY PROJECT    EML  Annotator   principalInvestigator   SOME_RANDOM_FUNDING_INFO   My Funder  MY_FUNDER  AWARD1  An example award title  https://example.org/someaward eml-test-doc SOME_ATTRIBUTE SOME_ATTRIBUTE's definition

Comparing value for field isService
Doc Value:  false
Solr Value: false

Comparing value for field id
Doc Value:  eml-test-doc
Solr Value: eml-test-doc

Comparing value for field identifier
Doc Value:  eml-test-doc
Solr Value: eml-test-doc

Comparing value for field formatId
Doc Value:  https://eml.ecoinformatics.org/eml-2.2.0
Solr Value: https://eml.ecoinformatics.org/eml-2.2.0

Comparing value for field formatType
Doc Value:  METADATA
Solr Value: METADATA

Comparing value for field size
Doc Value:  0
Solr Value: 0

Comparing value for field checksum
Doc Value:  12345
Solr Value: 12345

Comparing value for field submitter
Doc Value:  dataone_integration_test_user
Solr Value: dataone_integration_test_user

Comparing value for field checksumAlgorithm
Doc Value:  MD5
Solr Value: MD5

Comparing value for field rightsHolder
Doc Value:  dataone_integration_test_user
Solr Value: dataone_integration_test_user

Comparing value for field replicationAllowed
Doc Value:  true
Solr Value: true

Comparing value for field dateUploaded
Doc Value:  Wed Jul 31 15:59:47 AST 2019
Solr Value: Wed Jul 31 15:59:47 AST 2019

Comparing value for field dateModified
Doc Value:  Wed Jul 31 15:59:47 AST 2019
Solr Value: Wed Jul 31 15:59:47 AST 2019

Comparing value for field datasource
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field authoritativeMN
Doc Value:  test_documents
Solr Value: test_documents

Comparing value for field readPermission
Doc Value:  [public, dataone_public_user]
Solr Value: [public, dataone_public_user]

Comparing value for field writePermission
Doc Value:  [dataone_integration_test_user]
Solr Value: [dataone_integration_test_user]

Comparing value for field isPublic
Doc Value:  true
Solr Value: true

Comparing value for field dataUrl
Doc Value:  https://cn.dataone.org/cn/v2/resolve/eml-test-doc
Solr Value: https://cn.dataone.org/cn/v2/resolve/eml-test-doc

[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.598 s - in org.dataone.cn.indexer.annotation.SolrIndexEmlAnnotationTest
[INFO] Running org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest
[ INFO] 2019-11-12 04:26:14,399 [main]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:14,459 [main]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
annotation: {
  "pid": "peggym.130.4", 
  "id": "annotation.130.4",   
  "field": "sem_annotation", 
  "reject": false, 
  "ranges": [
    {
      "start": "/section[1]/article[1]/form[1]/section[1]/div[1]/div[1]", 
      "end": "/section[1]/article[1]/form[1]/section[1]/div[1]/div[1]", 
      "startOffset": 0, 
      "endOffset": 4
    }
  ], 
  "permissions": {
    "read": [
      "group:__world__"
    ], 
    "delete": [], 
    "admin": [], 
    "update": []
  }, 
  "user": "CN=Benjamin Leinfelder A515,O=University of Chicago,C=US,DC=cilogon,DC=org", 
  "consumer": "metacat", 
  "updated": "2014-12-03T23:29:20.262152+00:00", 
  "quote": "Data", 
  "oa:Motivation": "oa:tagging", 
  "tags": ["http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass"], 
  "text": "Original annotation content", 
  "created": "2014-12-03T23:09:25.501665+00:00", 
  "uri": "https://cn-dev.test.dataone.org/cn/v1/object/peggym.130.4"
}
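The AnnotatorSubprocessor consumes records like the JSON above and merges the `tags` values into the target pid's Solr document under the field named by `field` (with `sem_annotated_by` recording the annotation id, as seen in the indexed document earlier). A hypothetical sketch of that mapping, reduced to the keys that matter here:

```python
import json

# Hypothetical sketch of how an annotator-style record maps onto the
# sem_annotation / sem_annotated_by Solr fields seen earlier in the log.
# Only the relevant keys of the full annotation JSON are kept.
annotation = json.loads("""{
  "pid": "peggym.130.4",
  "id": "annotation.130.4",
  "field": "sem_annotation",
  "tags": ["http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass"]
}""")

solr_update = {
    "id": annotation["pid"],                  # document being annotated
    annotation["field"]: annotation["tags"],  # concept URIs to index
    "sem_annotated_by": [annotation["id"]],   # provenance of the tags
}
```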
[ERROR] 2019-11-12 04:26:14,478 [main]  (org.dataone.cn.indexer.annotation.AnnotatorSubprocessor:processDocument:184) Unable to retrieve solr document: peggym.130.4.  Exception attempting to communicate with solr server.
java.io.IOException: org.apache.solr.client.solrj.SolrServerException: Server refused connection at: http://localhost:8983/solr/collection1
	at org.dataone.cn.indexer.solrhttp.SolrJClient.getDocumentsBySolrId(SolrJClient.java:561)
	at org.dataone.cn.indexer.solrhttp.SolrJClient.getDocumentsByD1Identifier(SolrJClient.java:495)
	at org.dataone.cn.indexer.solrhttp.SolrJClient.retrieveDocumentFromSolrServer(SolrJClient.java:815)
	at org.dataone.cn.indexer.annotation.AnnotatorSubprocessor.processDocument(AnnotatorSubprocessor.java:180)
	at org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest.compareFields(SolrFieldAnnotatorTest.java:119)
	at org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest.testAnnotationFields(SolrFieldAnnotatorTest.java:150)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
	at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:83)
	at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:231)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
	at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:174)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:272)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:236)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:386)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:323)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:143)
Caused by: org.apache.solr.client.solrj.SolrServerException: Server refused connection at: http://localhost:8983/solr/collection1
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:567)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:235)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:227)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:135)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:943)
	at org.apache.solr.client.solrj.SolrClient.getById(SolrClient.java:1174)
	at org.apache.solr.client.solrj.SolrClient.getById(SolrClient.java:1128)
	at org.apache.solr.client.solrj.SolrClient.getById(SolrClient.java:1144)
	at org.dataone.cn.indexer.solrhttp.SolrJClient.getDocumentsBySolrId(SolrJClient.java:555)
	... 35 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:117)
	at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
	at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
	at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
	at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
	at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:466)
	... 43 more
[ WARN] 2019-11-12 04:26:14,484 [main]  (org.dataone.cn.indexer.annotation.AnnotatorSubprocessor:processDocument:189) DID NOT LOCATE REFERENCED DOC: peggym.130.4
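The stack trace above is an environment condition rather than a test failure: nothing is listening on localhost:8983, so SolrJ's HTTP request dies with Connection refused and the subprocessor logs the missing referenced document. A small hedged sketch of a TCP reachability preflight that would distinguish "Solr is down" from genuine query errors (host and port taken from the log; this helper is not part of the project):

```python
import socket

def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeout, DNS failure, ...
        return False

# In the run above this would have returned False for ("localhost", 8983),
# explaining the SolrServerException before any query was attempted.
```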
Checking value: peggym.130.4
in expected: [peggym.130.4]
Checking value: annotation.130.4
in expected: [annotation.130.4]
Checking value: http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass
in expected: [http://ecoinformatics.org/oboe/oboe.1.0/oboe-characteristics.owl#Mass, http://ecoinformatics.org/oboe/oboe.1.0/oboe-core.owl#PhysicalCharacteristic, http://ecoinformatics.org/oboe/oboe.1.0/oboe-core.owl#Characteristic, http://www.w3.org/2000/01/rdf-schema#Resource]
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.428 s - in org.dataone.cn.indexer.annotation.SolrFieldAnnotatorTest
[INFO] Running org.dataone.cn.indexer.annotation.SolrFieldEmlAnnotationTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.dataone.cn.indexer.annotation.SolrFieldEmlAnnotationTest
[INFO] Running org.dataone.cn.indexer.annotation.OntologyModelServiceTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.dataone.cn.indexer.annotation.OntologyModelServiceTest
[INFO] Running org.dataone.cn.indexer.XmlDocumentUtilityTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.dataone.cn.indexer.XmlDocumentUtilityTest
[INFO] Running org.dataone.cn.indexer.parser.TestUpdateAssembler
[ INFO] 2019-11-12 04:26:14,505 [main]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
Name: formatId
Modifier: set
Value: emlversion2

Name: title
Modifier: set
Value: bestPublicationYet

Name: _version_
Modifier: null
Value: 1234567890

Name: id
Modifier: null
Value: MD

<doc><field name="identifier" modifier="set" >PID</field><field name="_version_" >1234567890</field><field name="id" >MD</field></doc>
DATA
MD
ORE
<doc><field name="id" >MD</field><field name="resourceMap" >ORE</field><field name="documents" >DATA</field><field name="_version_" >-1</field></doc>
Name: id
Modifier: null
Value: MD

Name: formatId
Modifier: null
Value: emlversion2

Name: title
Modifier: null
Value: bestPublicationYet

Name: _version_
Modifier: null
Value: -1

<doc><field name="id" >MD</field><field name="resourceMap" >ORE</field><field name="documents" >DATA</field><field name="_version_" >-1</field></doc>
id modifier: null
resourceMap Modifier: null
documents Modifier: null
_version_ Modifier: null
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.026 s - in org.dataone.cn.indexer.parser.TestUpdateAssembler
[INFO] Running org.dataone.cn.indexer.convert.TemporalPeriodParsingUtilityTest
[ERROR] 2019-11-12 04:26:14,526 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198) 
[ERROR] 2019-11-12 04:26:14,527 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: 2005
[ERROR] 2019-11-12 04:26:14,529 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198) 
[ERROR] 2019-11-12 04:26:14,529 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[ERROR] 2019-11-12 04:26:14,531 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198) 
[ERROR] 2019-11-12 04:26:14,532 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: 2000
[ERROR] 2019-11-12 04:26:14,533 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:parseDateTime:198) 
[ERROR] 2019-11-12 04:26:14,533 [main]  (org.dataone.cn.indexer.parser.utility.TemporalPeriodParsingUtility:formatDate:176) Date string could not be parsed: null
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.dataone.cn.indexer.convert.TemporalPeriodParsingUtilityTest
[INFO] Running org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeConverterTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeConverterTest
[INFO] Running org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeDocumentServiceTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.051 s - in org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypeDocumentServiceTest
[INFO] Running org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypesParserTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.dataone.cn.indexer.convert.MemberNodeServiceRegistrationTypesParserTest
[INFO] Running org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest
Creating dataDir: /tmp/org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest_90A4EAC4366D4ED4-001/init-core-data-001
[ INFO] 2019-11-12 04:26:14,847 [TEST-RdfXmlProcessorTest.testInit-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:14,904 [TEST-RdfXmlProcessorTest.testInit-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:14,924 [TEST-RdfXmlProcessorTest.testInit-seed#[90A4EAC4366D4ED4]]  (org.apache.solr.core.SolrResourceLoader:addToClassLoader:191) Can't find (or read) directory to add to classloader: lib (resolved as: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/lib).
[ WARN] 2019-11-12 04:26:15,050 [coreLoadExecutor-85-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class SpatialRecursivePrefixTreeFieldType
[ WARN] 2019-11-12 04:26:15,051 [coreLoadExecutor-85-thread-1]  (org.apache.solr.schema.AbstractSpatialFieldType:init:128) units parameter is deprecated, please use distanceUnits instead for field types with class BBoxField
[ WARN] 2019-11-12 04:26:15,061 [coreLoadExecutor-85-thread-1]  (org.apache.solr.core.SolrCore:initIndex:541) [collection1] Solr index directory '/var/lib/jenkins/jobs/d1_cn_index_processor/workspace/./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/data/index' doesn't exist. Creating new index...
[ WARN] 2019-11-12 04:26:15,068 [coreLoadExecutor-85-thread-1]  (org.apache.solr.rest.ManagedResource:reloadFromStorage:182) No stored data found for /schema/analysis/synonyms/english
[ INFO] 2019-11-12 04:26:15,261 [TEST-RdfXmlProcessorTest.testPartFields-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:15,315 [TEST-RdfXmlProcessorTest.testPartFields-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:15,322 [TEST-RdfXmlProcessorTest.testPartFields-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
referencedPid: urn:uuid:f18812ac-7f4f-496c-82cc-3f4f54830274
referencedPid: urn:uuid:27ae3627-be62-4963-859a-8c96d940cadc
[ INFO] 2019-11-12 04:26:15,903 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:15,958 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:15,967 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing\-DataProcessor.1.m
[ WARN] 2019-11-12 04:26:15,980 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertProvResourceMap(RdfXmlProcessorTest.java:363)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing\-Configure.1.m
[ WARN] 2019-11-12 04:26:16,018 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing-Configure.1.m to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertProvResourceMap(RdfXmlProcessorTest.java:368)
	... 39 more
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing\-schedule_AW02XX_001CTDXXXXR00_processing.1.m
[ WARN] 2019-11-12 04:26:16,034 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertProvResourceMap(RdfXmlProcessorTest.java:372)
	... 39 more
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing.eml.1.xml
[ WARN] 2019-11-12 04:26:16,048 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing.eml.1.xml to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertProvResourceMap(RdfXmlProcessorTest.java:379)
	... 39 more
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-ns02\-image\-data\-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg
[ WARN] 2019-11-12 04:26:16,064 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-ns02-image-data-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertProvResourceMap(RdfXmlProcessorTest.java:384)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:ala\-wai\-ns02\-ctd\-data.1.txt
[ WARN] 2019-11-12 04:26:16,078 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-ns02-ctd-data.1.txt to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertProvResourceMap(RdfXmlProcessorTest.java:390)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.status=? and indextask0_.tryCount<? order by indextask0_.priority asc, indextask0_.taskModifiedDate asc
[ INFO] 2019-11-12 04:26:21,107 [pool-1-thread-1]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.status=? and indextask0_.nextExecution<? and indextask0_.tryCount<?
[ INFO] 2019-11-12 04:26:21,107 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:21,109 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@563d6392
[ INFO] 2019-11-12 04:26:21,110 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing.eml.1.xml
[ INFO] 2019-11-12 04:26:21,107 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:21,111 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@350a6da
[ INFO] 2019-11-12 04:26:21,111 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:21,112 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6f6f068a
[ INFO] 2019-11-12 04:26:21,112 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
the filter is id:ala\-wai\-canal\-ns02\-matlab\-processing.2.rdf
[ INFO] 2019-11-12 04:26:21,108 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:21,113 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@38e7dd07
[ INFO] 2019-11-12 04:26:21,113 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-ns02-image-data-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg
[ INFO] 2019-11-12 04:26:21,108 [pool-1-thread-3]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:21,114 [pool-1-thread-3]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@2b871cf5
[ INFO] 2019-11-12 04:26:21,114 [pool-1-thread-3]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m
[ INFO] 2019-11-12 04:26:21,114 [pool-1-thread-3]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:21,108 [pool-1-thread-1]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@16954eff
[ INFO] 2019-11-12 04:26:21,116 [pool-1-thread-1]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m
[ WARN] 2019-11-12 04:26:21,115 [TEST-RdfXmlProcessorTest.testInsertProvResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for ala-wai-canal-ns02-matlab-processing.2.rdf to its system metadata. However, this event still should be granted for indexing for safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertProvResourceMap(RdfXmlProcessorTest.java:399)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
[ INFO] 2019-11-12 04:26:21,114 [pool-1-thread-5]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:21,112 [pool-1-thread-4]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
[ INFO] 2019-11-12 04:26:21,112 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing-Configure.1.m
[ INFO] 2019-11-12 04:26:21,124 [pool-1-thread-2]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
[ INFO] 2019-11-12 04:26:21,116 [pool-1-thread-1]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
[ INFO] 2019-11-12 04:26:21,145 [pool-1-thread-3]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:21,145 [pool-1-thread-3]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@ff69210
Hibernate: delete from index_task where id=? and version=?
[ INFO] 2019-11-12 04:26:21,146 [pool-1-thread-3]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: testMNodeTier3:2012679267486_common-bmp-doc-example-ฉันกินกระจกได้
[ INFO] 2019-11-12 04:26:21,145 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:21,146 [pool-1-thread-3]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:21,146 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@64a0b84c
[ INFO] 2019-11-12 04:26:21,147 [pool-1-thread-4]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-ns02-ctd-data.1.txt
[ INFO] 2019-11-12 04:26:21,147 [pool-1-thread-4]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.status=? and indextask0_.tryCount<? order by indextask0_.priority asc, indextask0_.taskModifiedDate asc
Hibernate: select indextask0_.id as id56_, indextask0_.dateSysMetaModified as dateSysM2_56_, indextask0_.deleted as deleted56_, indextask0_.formatId as formatId56_, indextask0_.nextExecution as nextExec5_56_, indextask0_.objectPath as objectPath56_, indextask0_.pid as pid56_, indextask0_.priority as priority56_, indextask0_.status as status56_, indextask0_.sysMetadata as sysMeta10_56_, indextask0_.taskModifiedDate as taskMod11_56_, indextask0_.tryCount as tryCount56_, indextask0_.version as version56_ from index_task indextask0_ where indextask0_.status=? and indextask0_.nextExecution<? and indextask0_.tryCount<?
[ INFO] 2019-11-12 04:26:26,146 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:26,146 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@487ee508
[ INFO] 2019-11-12 04:26:26,146 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing.2.rdf
[ INFO] 2019-11-12 04:26:26,340 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 4
[ INFO] 2019-11-12 04:26:26,340 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: null
[ INFO] 2019-11-12 04:26:26,340 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing.eml.1.xml
[ INFO] 2019-11-12 04:26:26,345 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing.eml.1.xml
[ INFO] 2019-11-12 04:26:26,345 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709506170880
[ INFO] 2019-11-12 04:26:26,345 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing-Configure.1.m
[ INFO] 2019-11-12 04:26:26,348 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing-Configure.1.m
[ INFO] 2019-11-12 04:26:26,348 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709514559488
[ INFO] 2019-11-12 04:26:26,348 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m
[ INFO] 2019-11-12 04:26:26,351 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m
[ INFO] 2019-11-12 04:26:26,352 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709517705216
[ INFO] 2019-11-12 04:26:26,352 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m
[ INFO] 2019-11-12 04:26:26,355 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m
[ INFO] 2019-11-12 04:26:26,355 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709492539392
[ INFO] 2019-11-12 04:26:26,412 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 9
[ INFO] 2019-11-12 04:26:26,412 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6bc03da0
[ INFO] 2019-11-12 04:26:26,413 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: ala-wai-canal-ns02-matlab-processing.2.rdf
[ INFO] 2019-11-12 04:26:26,413 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing.eml.1.xml
[ INFO] 2019-11-12 04:26:26,416 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing.eml.1.xml
[ INFO] 2019-11-12 04:26:26,416 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709506170880
[ INFO] 2019-11-12 04:26:26,416 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-ctd-data.1.txt
[ INFO] 2019-11-12 04:26:26,421 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing-Configure.1.m
[ INFO] 2019-11-12 04:26:26,423 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing-Configure.1.m
[ INFO] 2019-11-12 04:26:26,424 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709514559488
[ INFO] 2019-11-12 04:26:26,424 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing.2.rdf#aggregation
[ INFO] 2019-11-12 04:26:26,428 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m
[ INFO] 2019-11-12 04:26:26,431 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing-DataProcessor.1.m
[ INFO] 2019-11-12 04:26:26,431 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709517705216
[ INFO] 2019-11-12 04:26:26,431 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m
[ INFO] 2019-11-12 04:26:26,434 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:135) found existing record for the stub, id = ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m
[ INFO] 2019-11-12 04:26:26,434 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:136)     .... version is: 1649968709492539392
[ INFO] 2019-11-12 04:26:26,434 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: urn:uuid:6EC8CAB7-2063-4440-BA23-364313C145FC
[ INFO] 2019-11-12 04:26:26,439 [pool-1-thread-2]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:129)  other document to process: ala-wai-canal-ns02-image-data-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg
[ INFO] 2019-11-12 04:26:26,443 [pool-1-thread-2]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Hibernate: select indextask0_.id as id56_0_, indextask0_.dateSysMetaModified as dateSysM2_56_0_, indextask0_.deleted as deleted56_0_, indextask0_.formatId as formatId56_0_, indextask0_.nextExecution as nextExec5_56_0_, indextask0_.objectPath as objectPath56_0_, indextask0_.pid as pid56_0_, indextask0_.priority as priority56_0_, indextask0_.status as status56_0_, indextask0_.sysMetadata as sysMeta10_56_0_, indextask0_.taskModifiedDate as taskMod11_56_0_, indextask0_.tryCount as tryCount56_0_, indextask0_.version as version56_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
[ INFO] 2019-11-12 04:26:41,358 [TEST-RdfXmlProcessorTest.testInsertPartsResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:41,420 [TEST-RdfXmlProcessorTest.testInsertPartsResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:41,428 [TEST-RdfXmlProcessorTest.testInsertPartsResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
the filter is id:urn\:uuid\:f18812ac\-7f4f\-496c\-82cc\-3f4f54830274
[ WARN] 2019-11-12 04:26:41,439 [TEST-RdfXmlProcessorTest.testInsertPartsResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for urn:uuid:f18812ac-7f4f-496c-82cc-3f4f54830274 to its system metadata. However, this event will still be granted for indexing, to be safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:648)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertPartsResourceMap(RdfXmlProcessorTest.java:441)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
the filter is id:urn\:uuid\:b210adf0\-f08a\-4cae\-aa86\-5b64605e9297
[ WARN] 2019-11-12 04:26:41,457 [TEST-RdfXmlProcessorTest.testInsertPartsResourceMap-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.generator.filter.HZEventFilter:filter:182) HZEventFilter.filter - there was an exception in comparing the solr record for urn:uuid:b210adf0-f08a-4cae-aa86-5b64605e9297 to its system metadata. However, this event will still be granted for indexing, to be safe.
java.lang.NullPointerException
	at org.dataone.cn.index.generator.filter.HZEventFilter.filter(HZEventFilter.java:115)
	at org.dataone.cn.index.generator.IndexTaskGenerator.processSystemMetaDataUpdate(IndexTaskGenerator.java:92)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.addSystemMetadata(RdfXmlProcessorTest.java:701)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.insertResource(RdfXmlProcessorTest.java:668)
	at org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest.testInsertPartsResourceMap(RdfXmlProcessorTest.java:447)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1627)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:872)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:886)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:845)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:747)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:781)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:792)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
	at java.lang.Thread.run(Thread.java:748)
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.pid=? and indextask0_.status=?
Hibernate: insert into index_task (id, dateSysMetaModified, deleted, formatId, nextExecution, objectPath, pid, priority, status, sysMetadata, taskModifiedDate, tryCount, version) values (null, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.status=? and indextask0_.tryCount<? order by indextask0_.priority asc, indextask0_.taskModifiedDate asc
[ INFO] 2019-11-12 04:26:46,477 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:46,478 [pool-1-thread-1]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:46,478 [pool-1-thread-1]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3ed0d59e
[ INFO] 2019-11-12 04:26:46,479 [pool-1-thread-1]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: urn:uuid:b210adf0-f08a-4cae-aa86-5b64605e9297
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.status=? and indextask0_.nextExecution<? and indextask0_.tryCount<?
[ INFO] 2019-11-12 04:26:46,478 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@45df1d7b
[ INFO] 2019-11-12 04:26:46,481 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: urn:uuid:f18812ac-7f4f-496c-82cc-3f4f54830274
[ INFO] 2019-11-12 04:26:46,482 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:46,483 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@48ab3373
[ INFO] 2019-11-12 04:26:46,483 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:46,484 [pool-1-thread-1]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
[ INFO] 2019-11-12 04:26:46,489 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:26:46,489 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@700d341
[ INFO] 2019-11-12 04:26:46,489 [pool-1-thread-5]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: null
[ INFO] 2019-11-12 04:26:46,489 [pool-1-thread-5]  (org.dataone.cn.indexer.SolrIndexServiceV2:sendCommand:445) sendCommand using partial update?: false
Hibernate: select indextask0_.id as id58_0_, indextask0_.dateSysMetaModified as dateSysM2_58_0_, indextask0_.deleted as deleted58_0_, indextask0_.formatId as formatId58_0_, indextask0_.nextExecution as nextExec5_58_0_, indextask0_.objectPath as objectPath58_0_, indextask0_.pid as pid58_0_, indextask0_.priority as priority58_0_, indextask0_.status as status58_0_, indextask0_.sysMetadata as sysMeta10_58_0_, indextask0_.taskModifiedDate as taskMod11_58_0_, indextask0_.tryCount as tryCount58_0_, indextask0_.version as version58_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select indextask0_.id as id58_0_, indextask0_.dateSysMetaModified as dateSysM2_58_0_, indextask0_.deleted as deleted58_0_, indextask0_.formatId as formatId58_0_, indextask0_.nextExecution as nextExec5_58_0_, indextask0_.objectPath as objectPath58_0_, indextask0_.pid as pid58_0_, indextask0_.priority as priority58_0_, indextask0_.status as status58_0_, indextask0_.sysMetadata as sysMeta10_58_0_, indextask0_.taskModifiedDate as taskMod11_58_0_, indextask0_.tryCount as tryCount58_0_, indextask0_.version as version58_0_ from index_task indextask0_ where indextask0_.id=?
Hibernate: delete from index_task where id=? and version=?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select count(indextask0_.id) as col_0_0_ from index_task indextask0_ where indextask0_.status=? limit ?
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.status=? and indextask0_.tryCount<? order by indextask0_.priority asc, indextask0_.taskModifiedDate asc
Hibernate: select indextask0_.id as id58_, indextask0_.dateSysMetaModified as dateSysM2_58_, indextask0_.deleted as deleted58_, indextask0_.formatId as formatId58_, indextask0_.nextExecution as nextExec5_58_, indextask0_.objectPath as objectPath58_, indextask0_.pid as pid58_, indextask0_.priority as priority58_, indextask0_.status as status58_, indextask0_.sysMetadata as sysMeta10_58_, indextask0_.taskModifiedDate as taskMod11_58_, indextask0_.tryCount as tryCount58_, indextask0_.version as version58_ from index_task indextask0_ where indextask0_.status=? and indextask0_.nextExecution<? and indextask0_.tryCount<?
[ INFO] 2019-11-12 04:26:56,717 [TEST-RdfXmlProcessorTest.testProvenanceFields-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:26:56,785 [TEST-RdfXmlProcessorTest.testProvenanceFields-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
[ WARN] 2019-11-12 04:26:56,794 [TEST-RdfXmlProcessorTest.testProvenanceFields-seed#[90A4EAC4366D4ED4]]  (org.dataone.cn.index.DataONESolrJettyTestBase:startJettyAndSolr:255) Jetty already started...
referencedPid: ala-wai-canal-ns02-ctd-data.1.txt
referencedPid: ala-wai-canal-ns02-image-data-AW02XX_001CTDXXXXR00_20150203_10day.1.jpg
referencedPid: ala-wai-canal-ns02-matlab-processing-schedule_AW02XX_001CTDXXXXR00_processing.1.m
referencedPid: ala-wai-canal-ns02-ctd-data.1.txt
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.492 s - in org.dataone.cn.indexer.resourcemap.RdfXmlProcessorTest
[INFO] Running org.dataone.cn.indexer.resourcemap.OREResourceMapTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.063 s - in org.dataone.cn.indexer.resourcemap.OREResourceMapTest
[INFO] Running org.dataone.cn.indexer.solrhttp.SolrElementFieldTest

                bath gom1k.nc : the GoM bathymetry derived from the GMRT and NGDC databases (recommended); NetCDF-3 classic format (total size 760 MB). topo gom1k.nc : the GoM topography (land and water) derived from the GMRT and NGDC databases (recommended); NetCDF-3 classic format (total size 127 MB). bath gom1k GEBCO.nc : the GoM bathymetry derived from the GEBCO and NGDC databases; NetCDF-3 classic format (total size 760 MB). topo gom1k GEBCO.nc : the GoM topography (land and water) derived from the GEBCO and NGDC databases; NetCDF-3 classic format (total size 127 MB). bath guide.pdf -> The manual of the developed GoM 0:01 o bathymetry Various renders of topography and bathymetry|||||
            
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.dataone.cn.indexer.solrhttp.SolrElementFieldTest
[INFO] Running org.dataone.cn.indexer.solrhttp.SolrSchemaBeanConfigTest
[ INFO] 2019-11-12 04:26:57,171 [main]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
internal
   * id
   * _version_
ore
   * resourceMap
   * documents
   * isDocumentedBy
   * hasPart
   * isPartOf
collections
   * label
   * logo
   * collectionQuery
mn_service
   * isService
   * serviceCoupling
   * serviceTitle
   * serviceDescription
   * serviceType
   * serviceEndpoint
   * serviceInput
   * serviceOutput
scimeta
   * author
   * authorSurName
   * authorGivenName
   * authorSurNameSort
   * authorGivenNameSort
   * authorLastName
   * abstract
   * keywords
   * keyConcept
   * southBoundCoord
   * northBoundCoord
   * westBoundCoord
   * eastBoundCoord
   * namedLocation
   * beginDate
   * endDate
   * title
   * scientificName
   * relatedOrganizations
   * datePublished
   * pubDate
   * investigator
   * investigatorText
   * ogcUrl
   * sku
   * LTERSite
   * origin
   * originText
   * titlestr
   * geoform
   * presentationCat
   * purpose
   * updateDate
   * edition
   * originator
   * originatorText
   * family
   * species
   * genus
   * kingdom
   * phylum
   * order
   * class
   * attributeName
   * attributeLabel
   * attributeDescription
   * attributeUnit
   * attribute
   * webUrl
   * contactOrganization
   * contactOrganizationText
   * keywordsText
   * placeKey
   * noBoundingBox
   * isSpatial
   * decade
   * gcmdKeyword
   * project
   * projectText
   * site
   * siteText
   * parameter
   * parameterText
   * sensor
   * sensorText
   * source
   * sourceText
   * term
   * termText
   * topic
   * topicText
   * fileID
   * text
   * geohash_1
   * geohash_2
   * geohash_3
   * geohash_4
   * geohash_5
   * geohash_6
   * geohash_7
   * geohash_8
   * geohash_9
   * funding
   * funderName
   * funderIdentifier
   * awardNumber
   * awardTitle
sem
   * sem_annotation
   * sem_annotated_by
   * sem_annotates
   * sem_comment
sysmeta
   * identifier
   * seriesId
   * fileName
   * mediaType
   * mediaTypeProperty
   * formatId
   * formatType
   * size
   * checksum
   * checksumAlgorithm
   * dateUploaded
   * dateModified
   * submitter
   * rightsHolder
   * authoritativeMN
   * replicationAllowed
   * numberReplicas
   * preferredReplicationMN
   * blockedReplicationMN
   * replicaMN
   * replicaVerifiedDate
   * replicationStatus
   * datasource
   * obsoletes
   * obsoletedBy
   * readPermission
   * writePermission
   * changePermission
   * isPublic
   * dataUrl
prov
   * prov_wasDerivedFrom
   * prov_wasInformedBy
   * prov_used
   * prov_generated
   * prov_generatedByProgram
   * prov_generatedByExecution
   * prov_generatedByUser
   * prov_usedByProgram
   * prov_usedByExecution
   * prov_usedByUser
   * prov_wasExecutedByExecution
   * prov_wasExecutedByUser
   * prov_hasSources
   * prov_hasDerivations
   * prov_instanceOfClass
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.026 s - in org.dataone.cn.indexer.solrhttp.SolrSchemaBeanConfigTest
[INFO] Running org.dataone.cn.indexer.AppTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 s - in org.dataone.cn.indexer.AppTest
[INFO] Running org.dataone.cn.indexer.processor.TestIndexTaskProcessorConcurrency
============= scheduler START ============== Tue Nov 12 04:26:57 UTC 2019
instantiating MockIndexTaskProcessorJob: 
MockIndexTaskProcessorJob: entering execute... 
       Submitted task 0
Starting task: 0
       Submitted task 1
Starting task: 1
       Submitted task 2
Starting task: 2
       Submitted task 3
       Submitted task 4
       Submitted task 5
       Submitted task 6
       Submitted task 7
       Submitted task 8
       Submitted task 9
       Submitted task 10
       Submitted task 11
       Submitted task 12
       Submitted task 13
       Submitted task 14
       Submitted task 15
       Submitted task 16
       Submitted task 17
       Submitted task 18
       Submitted task 19
       Submitted task 20
       Submitted task 21
       Submitted task 22
       Submitted task 23
       Submitted task 24
       Submitted task 25
       Submitted task 26
       Submitted task 27
       Submitted task 28
       Submitted task 29
       Submitted task 30
       Submitted task 31
       Submitted task 32
       Submitted task 33
       Submitted task 34
       Submitted task 35
       Submitted task 36
       Submitted task 37
       Submitted task 38
       Submitted task 39
       Submitted task 40
       Submitted task 41
       Submitted task 42
       Submitted task 43
       Submitted task 44
       Submitted task 45
       Submitted task 46
       Submitted task 47
       Submitted task 48
       Submitted task 49
       Submitted task 50
       Submitted task 51
       Submitted task 52
       Submitted task 53
       Submitted task 54
       Submitted task 55
       Submitted task 56
       Submitted task 57
       Submitted task 58
       Submitted task 59
       Submitted task 60
       Submitted task 61
       Submitted task 62
       Submitted task 63
       Submitted task 64
       Submitted task 65
       Submitted task 66
       Submitted task 67
       Submitted task 68
       Submitted task 69
       Submitted task 70
       Submitted task 71
       Submitted task 72
       Submitted task 73
       Submitted task 74
       Submitted task 75
       Submitted task 76
       Submitted task 77
       Submitted task 78
       Submitted task 79
       Submitted task 80
       Submitted task 81
       Submitted task 82
       Submitted task 83
       Submitted task 84
       Submitted task 85
       Submitted task 86
       Submitted task 87
       Submitted task 88
       Submitted task 89
       Submitted task 90
       Submitted task 91
       Submitted task 92
       Submitted task 93
       Submitted task 94
       Submitted task 95
       Submitted task 96
       Submitted task 97
       Submitted task 98
       Submitted task 99
MockIndexTaskProcessorJob...finished execution in (millis) 4
after 3000 millis, finishing task: 0
after 3000 millis, finishing task: 2
after 3000 millis, finishing task: 1
Starting task: 4
Starting task: 3
Starting task: 5
============= attempt to kill the Job ============== Tue Nov 12 04:27:02 UTC 2019
********************* ProcessorJob interrupt called, calling executorservice shutdown...
Job scheduler finished executing all jobs.
============= continue to wait 5 sec... ============== Tue Nov 12 04:27:02 UTC 2019
after 3000 millis, finishing task: 3
after 3000 millis, finishing task: 5
after 3000 millis, finishing task: 4
Starting task: 7
Starting task: 6
Starting task: 8
after 3000 millis, finishing task: 7
after 3000 millis, finishing task: 8
after 3000 millis, finishing task: 6
Starting task: 10
Starting task: 9
Starting task: 11
============= scheduler SHUTDOWN ============== Tue Nov 12 04:27:07 UTC 2019
after 3000 millis, finishing task: 9
after 3000 millis, finishing task: 10
after 3000 millis, finishing task: 11
Starting task: 13
Starting task: 12
Starting task: 14
after 3000 millis, finishing task: 13
after 3000 millis, finishing task: 14
after 3000 millis, finishing task: 12
Starting task: 16
Starting task: 15
Starting task: 17
after 3000 millis, finishing task: 15
after 3000 millis, finishing task: 17
after 3000 millis, finishing task: 16
Starting task: 19
Starting task: 18
Starting task: 20
after 3000 millis, finishing task: 19
after 3000 millis, finishing task: 20
after 3000 millis, finishing task: 18
Starting task: 22
Starting task: 21
Starting task: 23
after 3000 millis, finishing task: 21
after 3000 millis, finishing task: 23
after 3000 millis, finishing task: 22
Starting task: 25
Starting task: 24
Starting task: 26
after 3000 millis, finishing task: 25
after 3000 millis, finishing task: 26
Starting task: 28
after 3000 millis, finishing task: 24
Starting task: 27
Starting task: 29
after 3000 millis, finishing task: 28
after 3000 millis, finishing task: 27
Starting task: 30
after 3000 millis, finishing task: 29
Starting task: 31
Starting task: 32
after 3000 millis, finishing task: 30
after 3000 millis, finishing task: 31
Starting task: 33
after 3000 millis, finishing task: 32
Starting task: 34
Starting task: 35
after 3000 millis, finishing task: 34
after 3000 millis, finishing task: 33
after 3000 millis, finishing task: 35
Starting task: 37
Starting task: 36
Starting task: 38
after 3000 millis, finishing task: 37
after 3000 millis, finishing task: 38
after 3000 millis, finishing task: 36
Starting task: 40
Starting task: 41
Starting task: 39
============= DONE !!!!!!!!!! ============== Tue Nov 12 04:27:37 UTC 2019
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.049 s - in org.dataone.cn.indexer.processor.TestIndexTaskProcessorConcurrency
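The concurrency test above shows a job that submits ~100 tasks to a bounded pool, then gets an "interrupt" that calls executor shutdown, after which the already-queued tasks keep draining three at a time until DONE. A minimal standalone sketch of that pattern, using only `java.util.concurrent` (the class and method names here are illustrative, not the test's actual `MockIndexTaskProcessorJob`):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical stand-in for the job under test: it owns an executor,
// an external "interrupt" triggers shutdown(), and in-flight/queued
// tasks drain while no new submissions are accepted.
public class InterruptiblePoolJob {
    private final ExecutorService pool = Executors.newFixedThreadPool(3);

    void execute(int taskCount) {
        for (int i = 0; i < taskCount; i++) {
            final int id = i;
            pool.submit(() -> {
                try { Thread.sleep(50); } catch (InterruptedException ignored) { }
                return id;
            });
        }
    }

    // Analogous to the "ProcessorJob interrupt called, calling
    // executorservice shutdown" step in the log above.
    void interrupt() {
        pool.shutdown(); // stop accepting work; queued tasks still run
    }

    boolean awaitQuiescence() throws InterruptedException {
        return pool.awaitTermination(5, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        InterruptiblePoolJob job = new InterruptiblePoolJob();
        job.execute(10);
        job.interrupt();
        System.out.println("drained: " + job.awaitQuiescence());
    }
}
```

Note that `shutdown()` is an orderly shutdown: it explains why the log keeps printing "Starting task" / "finishing task" long after the kill attempt — the queue drains rather than being discarded.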
[INFO] Running org.dataone.cn.indexer.processor.QueuePrioritizerTest
 0. 5.00 5: added: A
 1. 5.00 5: added: A
 2. 5.00 5: added: A
 3. 5.00 5: added: A
 4. 5.00 5: added: A
 5. 5.00 5: added: A
 6. 5.00 5: added: A
 7. 5.00 5: added: A
 8. 5.00 5: added: A
 9. 5.00 5: added: A
10. 5.00 5: added: A
11. 5.00 5: added: A
12. 5.00 5: added: A
13. 5.00 5: added: A
14. 5.00 5: added: A
15. 5.00 5: added: A
16. 5.00 5: added: A
17. 5.00 5: added: A
18. 5.00 5: added: A
19. 5.00 5: added: A
20. 5.00 5: added: A
21. 5.00 5: added: A
22. 5.00 5: added: A
23. 5.00 5: added: A
24. 5.00 5: added: A
25. 5.00 5: added: A
26. 5.00 5: added: A
27. 5.00 5: added: A
28. 5.00 5: added: A
29. 5.00 5: added: A
30. 5.00 5: added: BB
31. 5.00 5: added: BB
32. 5.00 5: added: BB
33. 5.00 5: added: BB
34. 5.00 5: added: BB
35. 5.00 5: added: BB
36. 5.00 5: added: BB
37. 5.00 5: added: BB
38. 5.00 5: added: BB
39. 5.00 5: added: BB
40. 5.00 5: added: BB
41. 5.00 5: added: BB
42. 5.00 5: added: BB
43. 5.00 5: added: BB
44. 5.00 5: added: BB
45. 5.00 5: added: BB
46. 5.00 5: added: BB
47. 5.00 5: added: BB
48. 5.00 5: added: BB
49. 4.00 4: added: BB
50. 3.94 3: added: BB
51. 3.88 3: added: BB
52. 3.83 3: added: BB
53. 3.78 3: added: BB
54. 3.73 3: added: BB
55. 3.68 3: added: BB
56. 3.63 3: added: BB
57. 3.59 3: added: BB
58. 3.54 3: added: BB
59. 3.50 3: added: BB
60. 5.92 5: added: CCC
61. 5.84 5: added: CCC
62. 5.76 5: added: CCC
63. 5.69 5: added: CCC
64. 5.62 5: added: CCC
65. 5.55 5: added: CCC
66. 5.48 5: added: CCC
67. 5.41 5: added: CCC
68. 5.35 5: added: CCC
69. 5.29 5: added: CCC
70. 5.23 5: added: CCC
71. 5.17 5: added: CCC
72. 5.11 5: added: CCC
73. 5.05 5: added: CCC
74. 5.00 5: added: CCC
75. 4.95 4: added: CCC
76. 4.90 4: added: CCC
77. 4.85 4: added: CCC
78. 4.80 4: added: CCC
79. 4.75 4: added: CCC
80. 4.70 4: added: CCC
81. 4.66 4: added: CCC
82. 4.61 4: added: CCC
83. 4.57 4: added: CCC
84. 4.53 4: added: CCC
85. 4.49 4: added: CCC
86. 4.45 4: added: CCC
87. 4.41 4: added: CCC
88. 4.37 4: added: CCC
89. 4.33 4: added: CCC
 0. 3.00 3: added: A
 1. 3.00 3: added: BB
 2. 3.00 3: added: A
 3. 3.00 3: added: BB
 4. 3.00 3: added: A
 5. 3.00 3: added: BB
 6. 3.00 3: added: A
 7. 3.00 3: added: BB
 8. 3.00 3: added: A
 9. 3.00 3: added: BB
10. 3.00 3: added: A
11. 3.00 3: added: BB
12. 3.00 3: added: A
13. 3.00 3: added: BB
14. 3.00 3: added: A
15. 3.00 3: added: BB
16. 3.00 3: added: A
17. 3.00 3: added: BB
18. 3.00 3: added: A
19. 3.00 3: added: BB
20. 3.00 3: added: A
21. 3.00 3: added: BB
22. 3.00 3: added: A
23. 3.00 3: added: BB
24. 2.44 2: added: A
25. 2.50 2: added: BB
26. 2.44 2: added: A
27. 2.50 2: added: BB
28. 2.45 2: added: A
29. 2.50 2: added: BB
30. 2.45 2: added: A
31. 2.50 2: added: BB
32. 2.45 2: added: A
33. 2.50 2: added: BB
34. 2.46 2: added: A
35. 2.50 2: added: BB
36. 2.46 2: added: A
37. 2.50 2: added: BB
38. 2.46 2: added: A
39. 2.50 2: added: BB
40. 2.46 2: added: A
41. 2.50 2: added: BB
42. 2.47 2: added: A
43. 2.50 2: added: BB
44. 2.47 2: added: A
45. 2.50 2: added: BB
46. 2.47 2: added: A
47. 2.50 2: added: BB
48. 2.47 2: added: A
49. 2.50 2: added: BB
50. 2.50 2: added: A
51. 2.50 2: added: BB
52. 2.50 2: added: A
53. 2.50 2: added: BB
54. 2.50 2: added: A
55. 2.50 2: added: BB
56. 2.50 2: added: A
57. 2.50 2: added: BB
58. 2.50 2: added: A
59. 2.50 2: added: BB
60. 2.50 2: added: A
61. 2.50 2: added: BB
62. 2.50 2: added: A
63. 2.50 2: added: BB
64. 2.50 2: added: A
65. 2.50 2: added: BB
66. 2.50 2: added: A
67. 2.50 2: added: BB
68. 2.50 2: added: A
69. 2.50 2: added: BB
70. 2.50 2: added: A
71. 2.50 2: added: BB
72. 2.50 2: added: A
73. 2.50 2: added: BB
74. 2.50 2: added: A
75. 2.50 2: added: BB
76. 2.50 2: added: A
77. 2.50 2: added: BB
78. 2.50 2: added: A
79. 2.50 2: added: BB
80. 2.50 2: added: A
81. 2.50 2: added: BB
82. 2.50 2: added: A
83. 2.50 2: added: BB
84. 2.50 2: added: A
85. 2.50 2: added: BB
86. 2.50 2: added: A
87. 2.50 2: added: BB
88. 2.50 2: added: A
89. 2.50 2: added: BB
90. 2.50 2: added: A
91. 2.50 2: added: BB
92. 2.50 2: added: A
93. 2.50 2: added: BB
94. 2.50 2: added: A
95. 2.50 2: added: BB
96. 2.50 2: added: A
97. 2.50 2: added: BB
98. 2.50 2: added: A
99. 2.50 2: added: BB
100. 2.50 2: added: A
101. 2.50 2: added: BB
102. 2.50 2: added: A
103. 2.50 2: added: BB
104. 2.50 2: added: A
105. 2.50 2: added: BB
106. 2.50 2: added: A
107. 2.50 2: added: BB
108. 2.50 2: added: A
109. 2.50 2: added: BB
110. 2.50 2: added: A
111. 2.50 2: added: BB
112. 2.50 2: added: A
113. 2.50 2: added: BB
114. 2.50 2: added: A
115. 2.50 2: added: BB
116. 2.50 2: added: A
117. 2.50 2: added: BB
118. 2.50 2: added: A
119. 2.50 2: added: BB
120. 2.50 2: added: A
121. 2.50 2: added: BB
122. 2.50 2: added: A
123. 2.50 2: added: BB
124. 2.50 2: added: A
125. 2.50 2: added: BB
126. 2.50 2: added: A
127. 2.50 2: added: BB
128. 2.50 2: added: A
129. 2.50 2: added: BB
130. 2.50 2: added: A
131. 2.50 2: added: BB
132. 2.50 2: added: A
133. 2.50 2: added: BB
134. 2.50 2: added: A
135. 2.50 2: added: BB
136. 2.50 2: added: A
137. 2.50 2: added: BB
138. 2.50 2: added: A
139. 2.50 2: added: BB
140. 2.50 2: added: A
141. 2.50 2: added: BB
142. 2.50 2: added: A
143. 2.50 2: added: BB
144. 2.50 2: added: A
145. 2.50 2: added: BB
146. 2.50 2: added: A
147. 2.50 2: added: BB
148. 2.50 2: added: A
149. 2.50 2: added: BB
150. 2.50 2: added: A
151. 2.50 2: added: BB
152. 2.50 2: added: A
153. 2.50 2: added: BB
154. 2.50 2: added: A
155. 2.50 2: added: BB
156. 2.50 2: added: A
157. 2.50 2: added: BB
158. 2.50 2: added: A
159. 2.50 2: added: BB
160. 2.50 2: added: A
161. 2.50 2: added: BB
162. 2.50 2: added: A
163. 2.50 2: added: BB
164. 2.50 2: added: A
165. 2.50 2: added: BB
166. 2.50 2: added: A
167. 2.50 2: added: BB
168. 2.50 2: added: A
169. 2.50 2: added: BB
170. 2.50 2: added: A
171. 2.50 2: added: BB
172. 2.50 2: added: A
173. 2.50 2: added: BB
174. 2.50 2: added: A
175. 2.50 2: added: BB
176. 2.50 2: added: A
177. 2.50 2: added: BB
178. 2.50 2: added: A
179. 2.50 2: added: BB
180. 2.50 2: added: A
181. 2.50 2: added: BB
182. 2.50 2: added: A
183. 2.50 2: added: BB
184. 2.50 2: added: A
185. 2.50 2: added: BB
186. 2.50 2: added: A
187. 2.50 2: added: BB
188. 2.50 2: added: A
189. 2.50 2: added: BB
190. 2.50 2: added: A
191. 2.50 2: added: BB
192. 2.50 2: added: A
193. 2.50 2: added: BB
194. 2.50 2: added: A
195. 2.50 2: added: BB
196. 2.50 2: added: A
197. 2.50 2: added: BB
198. 2.50 2: added: A
199. 2.50 2: added: BB
200. 3.94 3: added: CCC
201. 2.50 2: added: BB
202. 3.88 3: added: CCC
203. 2.50 2: added: BB
204. 3.82 3: added: CCC
205. 2.50 2: added: BB
206. 3.76 3: added: CCC
207. 2.50 2: added: BB
208. 3.70 3: added: CCC
209. 2.50 2: added: BB
210. 3.64 3: added: CCC
211. 2.50 2: added: BB
212. 3.58 3: added: CCC
213. 2.50 2: added: BB
214. 3.52 3: added: CCC
215. 2.50 2: added: BB
216. 3.46 3: added: CCC
217. 2.50 2: added: BB
218. 3.40 3: added: CCC
219. 2.50 2: added: BB
220. 3.34 3: added: CCC
221. 2.50 2: added: BB
222. 3.28 3: added: CCC
223. 2.50 2: added: BB
224. 3.22 3: added: CCC
225. 2.50 2: added: BB
226. 3.16 3: added: CCC
227. 2.50 2: added: BB
228. 3.10 3: added: CCC
229. 2.50 2: added: BB
230. 3.04 3: added: CCC
231. 2.50 2: added: BB
232. 2.98 2: added: CCC
233. 2.50 2: added: BB
234. 2.92 2: added: CCC
235. 2.50 2: added: BB
236. 2.86 2: added: CCC
237. 2.50 2: added: BB
238. 2.80 2: added: CCC
239. 2.50 2: added: BB
240. 2.74 2: added: CCC
241. 2.50 2: added: BB
242. 2.68 2: added: CCC
243. 2.50 2: added: BB
244. 2.62 2: added: CCC
245. 2.50 2: added: BB
246. 2.56 2: added: CCC
247. 2.50 2: added: BB
248. 2.50 2: added: CCC
249. 2.50 2: added: BB
 0. 2.00 2: added: A
 1. 2.00 2: added: A
 2. 2.00 2: added: A
 3. 2.00 2: added: A
 4. 2.00 2: added: A
 5. 2.00 2: added: A
 6. 2.00 2: added: A
 7. 2.00 2: added: A
 8. 2.00 2: added: A
 9. 2.00 2: added: A
10. 2.00 2: added: A
11. 2.00 2: added: A
12. 2.00 2: added: A
13. 2.00 2: added: A
14. 2.00 2: added: A
15. 2.00 2: added: A
16. 2.00 2: added: A
17. 2.00 2: added: A
18. 2.00 2: added: A
19. 2.00 2: added: A
20. 2.00 2: added: A
21. 2.00 2: added: A
22. 2.00 2: added: A
23. 2.00 2: added: A
24. 2.00 2: added: A
25. 2.00 2: added: A
26. 2.00 2: added: A
27. 2.00 2: added: A
28. 2.00 2: added: A
29. 2.00 2: added: A
30. 2.00 2: added: A
31. 2.00 2: added: A
32. 2.00 2: added: A
33. 2.00 2: added: A
34. 2.00 2: added: A
35. 2.00 2: added: A
36. 2.00 2: added: A
37. 2.00 2: added: A
38. 2.00 2: added: A
39. 2.00 2: added: A
40. 2.00 2: added: A
41. 2.00 2: added: A
42. 2.00 2: added: A
43. 2.00 2: added: A
44. 2.00 2: added: A
45. 2.00 2: added: A
46. 2.00 2: added: A
47. 2.00 2: added: A
48. 2.00 2: added: A
49. 1.00 1: added: A
50. 1.00 1: added: A
51. 1.00 1: added: A
52. 1.00 1: added: A
53. 1.00 1: added: A
54. 1.00 1: added: A
55. 1.00 1: added: A
56. 1.00 1: added: A
57. 1.00 1: added: A
58. 1.00 1: added: A
59. 1.00 1: added: A
60. 1.00 1: added: A
61. 1.00 1: added: A
62. 1.00 1: added: A
63. 1.00 1: added: A
64. 1.00 1: added: A
65. 1.00 1: added: A
66. 1.00 1: added: A
67. 1.00 1: added: A
68. 1.00 1: added: A
69. 1.00 1: added: A
70. 1.00 1: added: A
71. 1.00 1: added: A
72. 1.00 1: added: A
73. 1.00 1: added: A
74. 1.00 1: added: A
75. 1.00 1: added: A
76. 1.00 1: added: A
77. 1.00 1: added: A
78. 1.00 1: added: A
79. 1.00 1: added: A
80. 1.00 1: added: A
81. 1.00 1: added: A
82. 1.00 1: added: A
83. 1.00 1: added: A
84. 1.00 1: added: A
85. 1.00 1: added: A
86. 1.00 1: added: A
87. 1.00 1: added: A
88. 1.00 1: added: A
89. 1.00 1: added: A
90. 1.00 1: added: A
91. 1.00 1: added: A
92. 1.00 1: added: A
93. 1.00 1: added: A
94. 1.00 1: added: A
95. 1.00 1: added: A
96. 1.00 1: added: A
97. 1.00 1: added: A
98. 1.00 1: added: A
99. 1.00 1: added: A
100. 1.00 1: added: A
101. 1.00 1: added: A
102. 1.00 1: added: A
103. 1.00 1: added: A
104. 1.00 1: added: A
105. 1.00 1: added: A
106. 1.00 1: added: A
107. 1.00 1: added: A
108. 1.00 1: added: A
109. 1.00 1: added: A
110. 1.00 1: added: A
111. 1.00 1: added: A
112. 1.00 1: added: A
113. 1.00 1: added: A
114. 1.00 1: added: A
115. 1.00 1: added: A
116. 1.00 1: added: A
117. 1.00 1: added: A
118. 1.00 1: added: A
119. 1.00 1: added: A
120. 1.00 1: added: A
121. 1.00 1: added: A
122. 1.00 1: added: A
123. 1.00 1: added: A
124. 1.00 1: added: A
125. 1.00 1: added: A
126. 1.00 1: added: A
127. 1.00 1: added: A
128. 1.00 1: added: A
129. 1.00 1: added: A
130. 1.00 1: added: A
131. 1.00 1: added: A
132. 1.00 1: added: A
133. 1.00 1: added: A
134. 1.00 1: added: A
135. 1.00 1: added: A
136. 1.00 1: added: A
137. 1.00 1: added: A
138. 1.00 1: added: A
139. 1.00 1: added: A
140. 1.00 1: added: A
141. 1.00 1: added: A
142. 1.00 1: added: A
143. 1.00 1: added: A
144. 1.00 1: added: A
145. 1.00 1: added: A
146. 1.00 1: added: A
147. 1.00 1: added: A
148. 1.00 1: added: A
149. 1.00 1: added: A
150. 1.00 1: added: A
151. 1.00 1: added: A
152. 1.00 1: added: A
153. 1.00 1: added: A
154. 1.00 1: added: A
155. 1.00 1: added: A
156. 1.00 1: added: A
157. 1.00 1: added: A
158. 1.00 1: added: A
159. 1.00 1: added: A
160. 1.00 1: added: A
161. 1.00 1: added: A
162. 1.00 1: added: A
163. 1.00 1: added: A
164. 1.00 1: added: A
165. 1.00 1: added: A
166. 1.00 1: added: A
167. 1.00 1: added: A
168. 1.00 1: added: A
169. 1.00 1: added: A
170. 1.00 1: added: A
171. 1.00 1: added: A
172. 1.00 1: added: A
173. 1.00 1: added: A
174. 1.00 1: added: A
175. 1.00 1: added: A
176. 1.00 1: added: A
177. 1.00 1: added: A
178. 1.00 1: added: A
179. 1.00 1: added: A
180. 1.00 1: added: A
181. 1.00 1: added: A
182. 1.00 1: added: A
183. 1.00 1: added: A
184. 1.00 1: added: A
185. 1.00 1: added: A
186. 1.00 1: added: A
187. 1.00 1: added: A
188. 1.00 1: added: A
189. 1.00 1: added: A
190. 1.00 1: added: A
191. 1.00 1: added: A
192. 1.00 1: added: A
193. 1.00 1: added: A
194. 1.00 1: added: A
195. 1.00 1: added: A
196. 1.00 1: added: A
197. 1.00 1: added: A
198. 1.00 1: added: A
199. 1.00 1: added: A
200. 2.98 2: added: B
201. 1.02 1: added: A
202. 2.96 2: added: B
203. 1.04 1: added: A
204. 2.94 2: added: B
205. 1.06 1: added: A
206. 2.92 2: added: B
207. 1.08 1: added: A
208. 2.90 2: added: B
209. 1.10 1: added: A
210. 2.88 2: added: B
211. 1.12 1: added: A
212. 2.86 2: added: B
213. 1.14 1: added: A
214. 2.84 2: added: B
215. 1.16 1: added: A
216. 2.82 2: added: B
217. 1.18 1: added: A
218. 2.80 2: added: B
219. 1.20 1: added: A
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.029 s - in org.dataone.cn.indexer.processor.QueuePrioritizerTest
[INFO] Running org.dataone.cn.indexer.processor.ResourceMapSubprocessorTest
hello?
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 s - in org.dataone.cn.indexer.processor.ResourceMapSubprocessorTest
[INFO] Running org.dataone.cn.indexer.processor.ProcessorShutdownTest
Callable 1 is sleeping... 1573532857295
Callable 2 is sleeping... 1573532857315
Callable 3 is sleeping... 1573532857336
Callable 4 is sleeping... 1573532857356
Callable 5 is sleeping... 1573532857376
Callable 6 is sleeping... 1573532857396
Callable 7 is sleeping... 1573532857417
Callable 8 is sleeping... 1573532857437
Callable 9 is sleeping... 1573532857457
Callable 10 is sleeping... 1573532857477
Shutting down the executor service... 1573532858281
Try to submit more tasks to shutdown executor... 1573532858282
Exception from executor service while trying to submit tasks to a shutdown executor
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@68183f19 rejected from java.util.concurrent.ThreadPoolExecutor@3ceb8e1f[Shutting down, pool size = 10, active threads = 10, queued tasks = 40, completed tasks = 0]
	at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
	at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
	at org.dataone.cn.indexer.processor.ProcessorShutdownTest.testShutdown(ProcessorShutdownTest.java:41)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:272)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:236)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:386)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:323)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:143)
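The `RejectedExecutionException` above is the expected behavior the test provokes: once `shutdown()` has been called, a `ThreadPoolExecutor` with the default `AbortPolicy` rejects any further `submit()`. A self-contained sketch of the same behavior (not the test's actual code):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;

public class RejectAfterShutdown {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.submit(() -> "ok");   // accepted while the pool is running
        pool.shutdown();           // orderly shutdown: no new submissions
        try {
            pool.submit(() -> "too late");
            System.out.println("unexpectedly accepted");
        } catch (RejectedExecutionException e) {
            // Default AbortPolicy throws for tasks submitted after shutdown()
            System.out.println("rejected after shutdown");
        }
    }
}
```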
Canceling cancelable tasks... 1573532858285
Task 1 is NOT done. 1573532858285
Task 1 successfully canceled. 1573532858285
Task 2 is NOT done. 1573532858285
Task 2 successfully canceled. 1573532858285
Task 3 is NOT done. 1573532858285
Task 3 successfully canceled. 1573532858285
Task 4 is NOT done. 1573532858285
Task 4 successfully canceled. 1573532858285
Task 5 is NOT done. 1573532858285
Task 5 successfully canceled. 1573532858285
Task 6 is NOT done. 1573532858286
Task 6 successfully canceled. 1573532858286
Task 7 is NOT done. 1573532858286
Task 7 successfully canceled. 1573532858286
Task 8 is NOT done. 1573532858286
Task 8 successfully canceled. 1573532858286
Task 9 is NOT done. 1573532858286
Task 9 successfully canceled. 1573532858286
Task 10 is NOT done. 1573532858286
Task 10 successfully canceled. 1573532858286
Task 11 is NOT done. 1573532858286
Task 11 successfully canceled. 1573532858286
Task 12 is NOT done. 1573532858286
Task 12 successfully canceled. 1573532858286
Task 13 is NOT done. 1573532858286
Task 13 successfully canceled. 1573532858286
Task 14 is NOT done. 1573532858286
Task 14 successfully canceled. 1573532858286
Task 15 is NOT done. 1573532858286
Task 15 successfully canceled. 1573532858286
Task 16 is NOT done. 1573532858286
Task 16 successfully canceled. 1573532858286
Task 17 is NOT done. 1573532858286
Task 17 successfully canceled. 1573532858286
Task 18 is NOT done. 1573532858286
Task 18 successfully canceled. 1573532858286
Task 19 is NOT done. 1573532858286
Task 19 successfully canceled. 1573532858287
Task 20 is NOT done. 1573532858287
Task 20 successfully canceled. 1573532858287
Task 21 is NOT done. 1573532858287
Task 21 successfully canceled. 1573532858287
Task 22 is NOT done. 1573532858287
Task 22 successfully canceled. 1573532858287
Task 23 is NOT done. 1573532858287
Task 23 successfully canceled. 1573532858287
Task 24 is NOT done. 1573532858287
Task 24 successfully canceled. 1573532858287
Task 25 is NOT done. 1573532858287
Task 25 successfully canceled. 1573532858287
Task 26 is NOT done. 1573532858287
Task 26 successfully canceled. 1573532858287
Task 27 is NOT done. 1573532858287
Task 27 successfully canceled. 1573532858287
Task 28 is NOT done. 1573532858287
Task 28 successfully canceled. 1573532858287
Task 29 is NOT done. 1573532858287
Task 29 successfully canceled. 1573532858287
Task 30 is NOT done. 1573532858287
Task 30 successfully canceled. 1573532858287
Task 31 is NOT done. 1573532858287
Task 31 successfully canceled. 1573532858287
Task 32 is NOT done. 1573532858287
Task 32 successfully canceled. 1573532858287
Task 33 is NOT done. 1573532858288
Task 33 successfully canceled. 1573532858288
Task 34 is NOT done. 1573532858288
Task 34 successfully canceled. 1573532858288
Task 35 is NOT done. 1573532858288
Task 35 successfully canceled. 1573532858288
Task 36 is NOT done. 1573532858288
Task 36 successfully canceled. 1573532858288
Task 37 is NOT done. 1573532858288
Task 37 successfully canceled. 1573532858288
Task 38 is NOT done. 1573532858288
Task 38 successfully canceled. 1573532858288
Task 39 is NOT done. 1573532858288
Task 39 successfully canceled. 1573532858288
Task 40 is NOT done. 1573532858288
Task 40 successfully canceled. 1573532858288
Task 41 is NOT done. 1573532858288
Task 41 successfully canceled. 1573532858288
Task 42 is NOT done. 1573532858288
Task 42 successfully canceled. 1573532858288
Task 43 is NOT done. 1573532858288
Task 43 successfully canceled. 1573532858288
Task 44 is NOT done. 1573532858288
Task 44 successfully canceled. 1573532858288
Task 45 is NOT done. 1573532858288
Task 45 successfully canceled. 1573532858289
Task 46 is NOT done. 1573532858289
Task 46 successfully canceled. 1573532858289
Task 47 is NOT done. 1573532858289
Task 47 successfully canceled. 1573532858289
Task 48 is NOT done. 1573532858289
Task 48 successfully canceled. 1573532858289
Task 49 is NOT done. 1573532858289
Task 49 successfully canceled. 1573532858289
Task 50 is NOT done. 1573532858289
Task 50 successfully canceled. 1573532858289
Sleeping 4000ms... 1573532858289
after 3000 millis, finishing task: 40
after 3000 millis, finishing task: 39
after 3000 millis, finishing task: 41
Starting task: 43
Starting task: 42
Starting task: 44
Callable 1 is done. 1573532859310
Callable 2 is done. 1573532859330
Callable 3 is done. 1573532859353
Callable 4 is done. 1573532859370
Callable 5 is done. 1573532859390
Callable 6 is done. 1573532859411
Callable 7 is done. 1573532859431
Callable 8 is done. 1573532859451
Callable 9 is done. 1573532859471
Callable 10 is done. 1573532859491
after 3000 millis, finishing task: 42
after 3000 millis, finishing task: 43
after 3000 millis, finishing task: 44
Starting task: 45
Starting task: 46
Starting task: 47
Starting a hard shutdown of executor... 1573532862289
Done. 1573532862289
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.014 s - in org.dataone.cn.indexer.processor.ProcessorShutdownTest
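The "Task N is NOT done" / "successfully canceled" pairs and the final "hard shutdown" in the test above follow the standard `Future.cancel(true)` + `shutdownNow()` pattern. A minimal sketch under the assumption that the test checks `isDone()` before canceling (class and message strings here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CancelDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        List<Future<?>> futures = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            futures.add(pool.submit(() -> {
                try { Thread.sleep(10_000); } catch (InterruptedException ignored) { }
            }));
        }
        int canceled = 0;
        for (Future<?> f : futures) {
            // cancel(true) interrupts a running task; a queued task is
            // simply removed before it ever starts
            if (!f.isDone() && f.cancel(true)) {
                canceled++;
            }
        }
        pool.shutdownNow(); // the "hard shutdown": interrupt workers, drop queue
        System.out.println("canceled " + canceled + " of " + futures.size());
    }
}
```

Because each task sleeps far longer than the loop takes to run, none is done when `cancel(true)` is called, mirroring the all-canceled outcome in the log.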
[INFO] 
[INFO] Results:
[INFO] 
[WARNING] Tests run: 109, Failures: 0, Errors: 0, Skipped: 11
[INFO] 
[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ d1_cn_index_processor ---
[INFO] Building jar: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/d1_cn_index_processor-2.4.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-shade-plugin:1.7.1:shade (default) @ d1_cn_index_processor ---
[INFO] Including xerces:xercesImpl:jar:2.9.1 in the shaded jar.
[INFO] Including xml-apis:xml-apis:jar:1.3.04 in the shaded jar.
[INFO] Including org.apache.solr:solr-solrj:jar:5.2.1 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpcore:jar:4.4.1 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpmime:jar:4.4.1 in the shaded jar.
[INFO] Including org.apache.zookeeper:zookeeper:jar:3.4.6 in the shaded jar.
[INFO] Including org.codehaus.woodstox:stax2-api:jar:3.1.4 in the shaded jar.
[INFO] Including org.codehaus.woodstox:woodstox-core-asl:jar:4.4.1 in the shaded jar.
[INFO] Including org.noggit:noggit:jar:0.6 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient:jar:4.3.3 in the shaded jar.
[INFO] Including commons-configuration:commons-configuration:jar:1.6 in the shaded jar.
[INFO] Including commons-fileupload:commons-fileupload:jar:1.2.1 in the shaded jar.
[INFO] Including commons-lang:commons-lang:jar:2.6 in the shaded jar.
[INFO] Including dom4j:dom4j:jar:1.6.1 in the shaded jar.
[INFO] Including joda-time:joda-time:jar:2.2 in the shaded jar.
[INFO] Including log4j:log4j:jar:1.2.17 in the shaded jar.
[INFO] Including org.ow2.asm:asm:jar:4.1 in the shaded jar.
[INFO] Including commons-beanutils:commons-beanutils:jar:1.8.3 in the shaded jar.
[INFO] Including commons-codec:commons-codec:jar:1.10 in the shaded jar.
[INFO] Including org.dataone:d1_cn_common:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including org.dataone:d1_common_java:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including javax.xml.bind:jaxb-api:jar:2.2.3 in the shaded jar.
[INFO] Including javax.xml.stream:stax-api:jar:1.0-2 in the shaded jar.
[INFO] Including org.apache.maven.plugins:maven-compiler-plugin:maven-plugin:2.3.1 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1 in the shaded jar.
[INFO] Including classworlds:classworlds:jar:1.1-alpha-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-api:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-artifact:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-core:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-settings:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-file:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-parameter-documenter:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-http-lightweight:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-http-shared:jar:1.0-beta-2 in the shaded jar.
[INFO] Including jtidy:jtidy:jar:4aug2000r7-dev in the shaded jar.
[INFO] Including org.apache.maven.reporting:maven-reporting-api:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.doxia:doxia-sink-api:jar:1.0-alpha-7 in the shaded jar.
[INFO] Including org.apache.maven:maven-profile:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-provider-api:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-repository-metadata:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-error-diagnostics:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-ssh-external:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-ssh-common:jar:1.0-beta-2 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-descriptor:jar:2.0.6 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4 in the shaded jar.
[INFO] Including org.apache.maven:maven-artifact-manager:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-monitor:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven.wagon:wagon-ssh:jar:1.0-beta-2 in the shaded jar.
[INFO] Including com.jcraft:jsch:jar:0.1.27 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-utils:jar:2.0.5 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-compiler-api:jar:1.8 in the shaded jar.
[INFO] Including org.apache.maven:maven-toolchain:jar:1.0 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-compiler-manager:jar:1.8 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-compiler-javac:jar:1.8 in the shaded jar.
[INFO] Including org.apache.maven.plugins:maven-jar-plugin:maven-plugin:2.3.1 in the shaded jar.
[INFO] Including org.apache.maven:maven-project:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-plugin-registry:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-model:jar:2.0.6 in the shaded jar.
[INFO] Including org.apache.maven:maven-archiver:jar:2.4.1 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-interpolation:jar:1.13 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-archiver:jar:1.0 in the shaded jar.
[INFO] Including org.codehaus.plexus:plexus-io:jar:1.0 in the shaded jar.
[INFO] Including org.apache.maven.plugins:maven-clean-plugin:maven-plugin:2.4.1 in the shaded jar.
[INFO] Including org.apache.commons:commons-collections4:jar:4.0 in the shaded jar.
[INFO] Including com.hazelcast:hazelcast-client:jar:2.4.1 in the shaded jar.
[INFO] Including org.apache.commons:commons-pool2:jar:2.4.2 in the shaded jar.
[INFO] Including org.dataone:d1_libclient_java:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including net.sf.jsignature.io-tools:easystream:jar:1.2.12 in the shaded jar.
[INFO] Including javax.mail:mail:jar:1.4.1 in the shaded jar.
[INFO] Including javax.activation:activation:jar:1.1 in the shaded jar.
[INFO] Including org.jibx:jibx-run:jar:1.2.4.5 in the shaded jar.
[INFO] Including xpp3:xpp3:jar:1.1.3.4.O in the shaded jar.
[INFO] Including org.bouncycastle:bcpkix-jdk15on:jar:1.52 in the shaded jar.
[INFO] Including org.bouncycastle:bcprov-jdk15on:jar:1.52 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient-cache:jar:4.3.6 in the shaded jar.
[INFO] Including com.googlecode.foresite-toolkit:foresite:jar:1.0-SNAPSHOT in the shaded jar.
[INFO] Including com.hp.hpl.jena:jena:jar:2.5.5 in the shaded jar.
[INFO] Including com.hp.hpl.jena:arq:jar:2.2 in the shaded jar.
[INFO] Including com.hp.hpl.jena:arq-extra:jar:2.2 in the shaded jar.
[INFO] Including com.hp.hpl.jena:jenatest:jar:2.5.5 in the shaded jar.
[INFO] Including com.hp.hpl.jena:iri:jar:0.5 in the shaded jar.
[INFO] Including com.hp.hpl.jena:concurrent-jena:jar:1.3.2 in the shaded jar.
[INFO] Including com.ibm.icu:icu4j:jar:3.4.4 in the shaded jar.
[INFO] Including com.hp.hpl.jena:json-jena:jar:1.0 in the shaded jar.
[INFO] Including stax:stax-api:jar:1.0 in the shaded jar.
[INFO] Including org.codehaus.woodstox:wstx-asl:jar:3.0.0 in the shaded jar.
[INFO] Including xerces:xmlParserAPIs:jar:2.0.2 in the shaded jar.
[INFO] Including rome:rome:jar:0.9 in the shaded jar.
[INFO] Including jdom:jdom:jar:1.0 in the shaded jar.
[INFO] Including xalan:xalan:jar:2.7.0 in the shaded jar.
[INFO] Including org.dataone:d1_cn_index_common:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including org.dataone:d1_cn_index_generator:jar:2.4.0-SNAPSHOT in the shaded jar.
[INFO] Including com.hazelcast:hazelcast:jar:2.4.1 in the shaded jar.
[INFO] Including com.hazelcast:hazelcast-spring:jar:2.4.1 in the shaded jar.
[INFO] Including org.quartz-scheduler:quartz:jar:2.1.1 in the shaded jar.
[INFO] Including c3p0:c3p0:jar:0.9.1.1 in the shaded jar.
[INFO] Including org.springframework.data:spring-data-jpa:jar:1.4.5.RELEASE in the shaded jar.
[INFO] Including org.aspectj:aspectjrt:jar:1.7.2 in the shaded jar.
[INFO] Including org.slf4j:jcl-over-slf4j:jar:1.7.1 in the shaded jar.
[INFO] Including org.springframework.data:spring-data-commons:jar:1.6.5.RELEASE in the shaded jar.
[INFO] Including org.javassist:javassist:jar:3.18.2-GA in the shaded jar.
[INFO] Including org.hibernate:hibernate-entitymanager:jar:3.6.10.Final in the shaded jar.
[INFO] Including org.hibernate:hibernate-core:jar:3.6.10.Final in the shaded jar.
[INFO] Including antlr:antlr:jar:2.7.6 in the shaded jar.
[INFO] Including org.hibernate:hibernate-commons-annotations:jar:3.2.0.Final in the shaded jar.
[INFO] Including javax.transaction:jta:jar:1.1 in the shaded jar.
[INFO] Including org.hibernate.javax.persistence:hibernate-jpa-2.0-api:jar:1.0.1.Final in the shaded jar.
[INFO] Including cglib:cglib:jar:3.1 in the shaded jar.
[INFO] Including commons-dbcp:commons-dbcp:jar:1.2.2 in the shaded jar.
[INFO] Including commons-pool:commons-pool:jar:1.3 in the shaded jar.
[INFO] Including postgresql:postgresql:jar:8.4-702.jdbc4 in the shaded jar.
[INFO] Including org.springframework:spring-core:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-asm:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-beans:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-context:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-expression:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-aop:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including aopalliance:aopalliance:jar:1.0 in the shaded jar.
[INFO] Including org.springframework:spring-context-support:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-tx:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-orm:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-jdbc:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-web:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including org.springframework:spring-test:jar:3.1.4.RELEASE in the shaded jar.
[INFO] Including commons-daemon:commons-daemon:jar:1.0.1 in the shaded jar.
[INFO] Including commons-io:commons-io:jar:2.0.1 in the shaded jar.
[INFO] Including org.apache.commons:commons-lang3:jar:3.5 in the shaded jar.
[INFO] Including commons-cli:commons-cli:jar:1.2 in the shaded jar.
[INFO] Including commons-logging:commons-logging:jar:1.1.1 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.5 in the shaded jar.
[INFO] Including org.slf4j:slf4j-log4j12:jar:1.7.5 in the shaded jar.
[INFO] Including log4j:apache-log4j-extras:jar:1.2.17 in the shaded jar.
[INFO] Including net.minidev:json-smart:jar:1.0.9 in the shaded jar.
[INFO] Including org.apache.jena:jena-tdb:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-arq:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-core:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-iri:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.jena:jena-base:jar:3.7.0 in the shaded jar.
[INFO] Including org.apache.commons:commons-csv:jar:1.5 in the shaded jar.
[INFO] Including com.github.andrewoma.dexx:collection:jar:0.7 in the shaded jar.
[INFO] Including org.apache.jena:jena-shaded-guava:jar:3.7.0 in the shaded jar.
[INFO] Including com.github.jsonld-java:jsonld-java:jar:0.11.1 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-core:jar:2.9.0 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-databind:jar:2.9.0 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-annotations:jar:2.9.0 in the shaded jar.
[INFO] Including org.apache.thrift:libthrift:jar:0.10.0 in the shaded jar.
[INFO] Including commons-collections:commons-collections:jar:3.2.1 in the shaded jar.
[INFO] Including ch.hsr:geohash:jar:1.0.10 in the shaded jar.
[INFO] Including net.sf.saxon:Saxon-HE:jar:9.9.1-3 in the shaded jar.
[WARNING] We have a duplicate org/w3c/dom/Attr.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/CDATASection.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/CharacterData.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Comment.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DOMException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DOMImplementation.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Document.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DocumentFragment.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/DocumentType.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Element.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Entity.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/EntityReference.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/NamedNodeMap.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Node.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/NodeList.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Notation.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/ProcessingInstruction.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/w3c/dom/Text.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/AttributeList.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/Attributes.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/ContentHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/DTDHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/DocumentHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/EntityResolver.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/ErrorHandler.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/HandlerBase.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/InputSource.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/Locator.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/Parser.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXNotRecognizedException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXNotSupportedException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/SAXParseException.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/XMLFilter.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate org/xml/sax/XMLReader.class in /var/lib/jenkins/.m2/repository/jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar
[WARNING] We have a duplicate javax/xml/namespace/QName.class in /var/lib/jenkins/.m2/repository/xpp3/xpp3/1.1.3.4.O/xpp3-1.1.3.4.O.jar
[WARNING] We have a duplicate javax/xml/XMLConstants.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/namespace/NamespaceContext.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/namespace/QName.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/EventFilter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryConfigurationError.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder$1.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder$ClassLoaderFinder.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder$ClassLoaderFinderConcrete.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/FactoryFinder.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/Location.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/StreamFilter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLEventFactory.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLEventReader.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLEventWriter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLInputFactory.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLOutputFactory.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLReporter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLResolver.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamConstants.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamException.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamReader.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/XMLStreamWriter.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Attribute.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Characters.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Comment.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/DTD.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EndDocument.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EndElement.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EntityDeclaration.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/EntityReference.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/Namespace.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/NotationDeclaration.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/ProcessingInstruction.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/StartDocument.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/StartElement.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/events/XMLEvent.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/EventReaderDelegate.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/StreamReaderDelegate.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/XMLEventAllocator.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate javax/xml/stream/util/XMLEventConsumer.class in /var/lib/jenkins/.m2/repository/stax/stax-api/1.0/stax-api-1.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/CommonConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/ReaderConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/ValidatorConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WriterConfig.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WstxInputProperties$ParsingMode.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WstxInputProperties.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/api/WstxOutputProperties.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/ErrorConsts.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/InputConfigFlags.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/OutputConfigFlags.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/ParsingErrorMsgs.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/cfg/XmlConsts.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ChoiceContentSpec$Validator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ChoiceContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ChoiceModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ConcatModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DFAState.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DFAValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDAttribute.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDCdataAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDEntitiesAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDEntityAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDEnumAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDId.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDIdAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDIdRefAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDIdRefsAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDNmTokenAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDNmTokensAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDNotationAttr.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDSchemaFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDSubset.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDSubsetImpl.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDTypingNonValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDValidatorBase.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DTDWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DefaultAttrValue$UndeclaredEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/DefaultAttrValue.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/EmptyValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/FullDTDReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/MinimalDTDReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/ModelNode.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/OptionalModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/SeqContentSpec$Validator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/SeqContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/StarModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/StructValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/TokenContentSpec$Validator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/TokenContentSpec.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/dtd/TokenModel.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/ent/EntityDecl.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/ent/ExtEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/ent/IntEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/ent/ParsedExtEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/ent/UnparsedExtEntity.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/BaseStartElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/CompactStartElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/DefaultEventAllocator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/MergedNsContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/SimpleStartElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/WDTD.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/WEntityDeclaration.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/WEntityReference.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/WNotationDeclaration.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/evt/WstxEventReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxEOFException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxIOException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxLazyException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxOutputException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxParsingException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxUnexpectedCharException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/exc/WstxValidationException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/AsciiReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/BaseInputSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/BaseReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/BranchingReaderSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/BufferRecycler.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/CharArraySource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/CharsetNames.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/DefaultInputResolver.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/ISOLatinReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/InputBootstrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/InputSourceFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/MergedReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/MergedStream.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/ReaderBootstrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/ReaderSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/StreamBootstrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/TextEscaper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/UTF32Reader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/UTF8Reader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/UTF8Writer.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/WstxInputData.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/WstxInputLocation.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/io/WstxInputSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/msv/AttributeProxy.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/msv/RelaxNGSchema.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/msv/RelaxNGSchemaFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/AttributeCollector.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/BasicStreamReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/CompactNsContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/ElemAttrs.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/ElemCallback.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/InputElementStack.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/InputProblemReporter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/NsDefaultProvider.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/ReaderCreator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/StreamReaderImpl.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/StreamScanner.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sr/ValidatingStreamReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/stax/WstxEventFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/stax/WstxInputFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/stax/WstxOutputFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/AsciiXmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/BaseNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/BaseStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/BufferingXmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/EncodingXmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/ISOLatin1XmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/NonNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/RepairingNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/SimpleNsStreamWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/SimpleOutputElement$AttrName.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/SimpleOutputElement.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriter.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriterWrapper$RawWrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriterWrapper$TextWrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/sw/XmlWriterWrapper.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/ArgUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/BaseNsContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/BijectiveNsMap.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/DataUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/DefaultXmlSymbolTable.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/EmptyNamespaceContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/ExceptionUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/InternCache.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/SimpleCache.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/StringUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/StringVector.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/SymbolTable$Bucket.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/SymbolTable.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/TextAccumulator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/TextBuffer$BufferReader.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/TextBuffer.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/TextBuilder.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/URLUtil.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/WordResolver$1.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/WordResolver$Builder.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/WordResolver.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/WordSet$Builder.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/WordSet.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate com/ctc/wstx/util/XmlChars.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/AttributeInfo.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/DTDInfo.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/LocationInfo.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/XMLEventReader2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/XMLInputFactory2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/XMLOutputFactory2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/XMLStreamLocation2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/XMLStreamProperties.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/XMLStreamReader2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/XMLStreamWriter2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/evt/DTD2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/evt/XMLEvent2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/evt/XMLEventFactory2.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/EscapingWriterFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2BlockResult.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2BlockSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2ByteArraySource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2CharArraySource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2FileResult.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2FileSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2ReferentialResult.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2ReferentialSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2Result.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2Source.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2StringSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/io/Stax2URLSource.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/AttributeContainer.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/DTDValidationSchema.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/Validatable.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/ValidationContext.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/ValidationProblemHandler.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/ValidatorPair.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidationException.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidationProblem.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidationSchema.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidationSchemaFactory.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/codehaus/stax2/validation/XMLValidator.class in /var/lib/jenkins/.m2/repository/org/codehaus/woodstox/wstx-asl/3.0.0/wstx-asl-3.0.0.jar
[WARNING] We have a duplicate org/xml/sax/InputSource.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/SAXException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/EntityResolver.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/ErrorHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/SAXNotRecognizedException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/SAXNotSupportedException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/SAXParseException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/Locator.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/Parser.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/XMLReader.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/ContentHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/DocumentHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/DTDHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/ext/DeclHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/ext/LexicalHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/AttributeList.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/Attributes.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/HandlerBase.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/DefaultHandler.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/NewInstance.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/ParserFactory.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/NamespaceSupport$Context.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/NamespaceSupport.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/AttributesImpl.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/XMLReaderAdapter$AttributesAdapter.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/XMLReaderAdapter.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/XMLFilterImpl.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/XMLReaderFactory.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/LocatorImpl.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/AttributeListImpl.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/ParserAdapter$AttributeListAdapter.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/helpers/ParserAdapter.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/xml/sax/XMLFilter.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Element.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Node.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Document.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/NodeList.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/events/EventTarget.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/events/DocumentEvent.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/events/EventListener.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/events/Event.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/events/EventException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/events/MutationEvent.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/DocumentType.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/CDATASection.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Text.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/CharacterData.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Entity.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/traversal/DocumentTraversal.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/traversal/NodeFilter.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/traversal/NodeIterator.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/traversal/TreeWalker.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/ranges/DocumentRange.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/ranges/Range.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/ranges/RangeException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Attr.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/DOMException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/NamedNodeMap.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/DOMImplementation.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/DocumentFragment.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Comment.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/ProcessingInstruction.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/EntityReference.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/Notation.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLOptGroupElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLDocument.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLFormElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLCollection.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLQuoteElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLHRElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTableRowElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLScriptElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLAppletElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLMapElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLOptionElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLLegendElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLHeadingElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLIFrameElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLOListElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLButtonElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLDivElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLDOMImplementation.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLFieldSetElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLBRElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLPreElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTableCaptionElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLMetaElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLModElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLBaseFontElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTitleElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLDirectoryElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLStyleElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLImageElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLIsIndexElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTableCellElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTableColElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLLabelElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLFrameSetElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLSelectElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLParagraphElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLHtmlElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLLinkElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLFontElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLFrameElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLBodyElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTableSectionElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLAnchorElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLBaseElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLObjectElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLInputElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLMenuElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLLIElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLParamElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLUListElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLDListElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLAreaElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTextAreaElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLTableElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/w3c/dom/html/HTMLHeadElement.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/DocumentBuilderFactory.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/DocumentBuilder.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/ParserConfigurationException.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/FactoryConfigurationError.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/SAXParser.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/SAXParserFactory.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/FactoryFinder$ConfigurationError.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate javax/xml/parsers/FactoryFinder.class in /var/lib/jenkins/.m2/repository/xerces/xmlParserAPIs/2.0.2/xmlParserAPIs-2.0.2.jar
[WARNING] We have a duplicate org/apache/commons/logging/impl/NoOpLog.class in /var/lib/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
[WARNING] We have a duplicate org/apache/commons/logging/impl/SimpleLog$1.class in /var/lib/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
[WARNING] We have a duplicate org/apache/commons/logging/impl/SimpleLog.class in /var/lib/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
[WARNING] We have a duplicate org/apache/commons/logging/Log.class in /var/lib/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
[WARNING] We have a duplicate org/apache/commons/logging/LogConfigurationException.class in /var/lib/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
[WARNING] We have a duplicate org/apache/commons/logging/LogFactory.class in /var/lib/jenkins/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
[WARNING] We have a duplicate org/apache/log4j/Appender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/AppenderSkeleton.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/AsyncAppender$DiscardSummary.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/AsyncAppender$Dispatcher.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/AsyncAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/BasicConfigurator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/Category.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/CategoryKey.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/ConsoleAppender$SystemErrStream.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/ConsoleAppender$SystemOutStream.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/ConsoleAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/DailyRollingFileAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/DefaultCategoryFactory.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/DefaultThrowableRenderer.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/Dispatcher.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/EnhancedPatternLayout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/EnhancedThrowableRenderer.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/FileAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/HTMLLayout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/Hierarchy.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/Layout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/Level.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/LogMF.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/LogManager.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/LogSF.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/LogXF.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/Logger.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/MDC.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/NDC$DiagnosticContext.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/NDC.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/NameValue.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/PatternLayout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/Priority.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/PropertyConfigurator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/PropertyWatchdog.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/ProvisionNode.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/RollingCalendar.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/RollingFileAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/SimpleLayout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/SortedKeyEnumeration.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/TTCCLayout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/WriterAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/BridgePatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/BridgePatternParser.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/CachedDateFormat.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/ClassNamePatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/DatePatternConverter$DefaultZoneDateFormat.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/DatePatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/FileDatePatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/FileLocationPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/FormattingInfo.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/FullLocationPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/IntegerPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/LevelPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/LineLocationPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/LineSeparatorPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/LiteralPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/LogEvent.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/LoggerPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/LoggingEventPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/MessagePatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/MethodLocationPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NDCPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NameAbbreviator$DropElementAbbreviator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NameAbbreviator$MaxElementAbbreviator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NameAbbreviator$NOPAbbreviator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NameAbbreviator$PatternAbbreviator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NameAbbreviator$PatternAbbreviatorFragment.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NameAbbreviator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/NamePatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/PatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/PatternParser$ReadOnlyMap.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/PatternParser.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/PropertiesPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/RelativeTimePatternConverter$CachedTimestamp.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/RelativeTimePatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/SequenceNumberPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/ThreadPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/pattern/ThrowableInformationPatternConverter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/AppenderAttachable.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/Configurator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/DefaultRepositorySelector.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/ErrorCode.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/ErrorHandler.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/Filter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/HierarchyEventListener.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/LocationInfo.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/LoggerFactory.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/LoggerRepository.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/LoggingEvent.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/NOPLogger.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/NOPLoggerRepository.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/NullWriter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/OptionHandler.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/RendererSupport.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/RepositorySelector.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/RootCategory.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/RootLogger.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/ThrowableInformation.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/ThrowableRenderer.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/ThrowableRendererSupport.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/TriggeringEventEvaluator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/spi/VectorWriter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/DenyAllFilter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/ExternallyRolledFileAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/FallbackErrorHandler.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/HUP.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/HUPNode.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/LevelMatchFilter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/LevelRangeFilter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/NullAppender.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/ReloadingPropertyConfigurator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/Roller.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/varia/StringMatchFilter.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/DOMConfigurator$1.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/DOMConfigurator$2.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/DOMConfigurator$3.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/DOMConfigurator$4.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/DOMConfigurator$5.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/DOMConfigurator$ParseAction.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/DOMConfigurator.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/Log4jEntityResolver.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/SAXErrorHandler.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/UnrecognizedElementHandler.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/XMLLayout.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/log4j/xml/XMLWatchdog.class in /var/lib/jenkins/.m2/repository/log4j/apache-log4j-extras/1.2.17/apache-log4j-extras-1.2.17.jar
[WARNING] We have a duplicate org/apache/commons/collections/ArrayStack.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/Buffer.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/BufferUnderflowException.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$1.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$CollectionView$CollectionViewIterator.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$CollectionView.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$EntrySet.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$KeySet.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/FastHashMap$Values.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
[WARNING] We have a duplicate org/apache/commons/collections/FastHashMap.class in /var/lib/jenkins/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
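The run of `[WARNING] We have a duplicate …` lines above is emitted while merging dependency jars into a single artifact: `xmlParserAPIs-2.0.2.jar` repackages `javax.xml.parsers` and `org.w3c.dom.html` classes, and `apache-log4j-extras-1.2.17.jar` repackages core `org.apache.log4j` classes, so the same entry name arrives from two jars and only one copy can win. As a rough sketch (not the plugin's actual code), detecting such collisions is just indexing entry names across jars; the jar paths passed in are placeholders, not taken from this build:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class DuplicateEntries {
    // Map each .class entry name to the list of jars that contain it.
    static Map<String, List<String>> index(String... jarPaths) throws IOException {
        Map<String, List<String>> owners = new HashMap<>();
        for (String path : jarPaths) {
            try (JarFile jar = new JarFile(path)) {
                for (Enumeration<JarEntry> e = jar.entries(); e.hasMoreElements(); ) {
                    JarEntry entry = e.nextElement();
                    if (entry.getName().endsWith(".class")) {
                        owners.computeIfAbsent(entry.getName(),
                                k -> new ArrayList<>()).add(path);
                    }
                }
            }
        }
        return owners;
    }

    public static void main(String[] args) throws IOException {
        // Any entry owned by more than one jar is a collision like those above.
        index(args).forEach((name, jars) -> {
            if (jars.size() > 1) {
                System.out.println("duplicate " + name + " in " + jars);
            }
        });
    }
}
```

The usual fix is to exclude the repackaging jar (here `xerces:xmlParserAPIs` and `log4j:apache-log4j-extras`) in favor of the canonical artifact, so only one copy of each class reaches the merged jar.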
[INFO] 
[INFO] --- maven-failsafe-plugin:2.8.1:integration-test (integration-test) @ d1_cn_index_processor ---
[INFO] Failsafe report directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/failsafe-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.dataone.cn.indexer.solrhttp.SolrJClientIT
[ERROR] 2019-11-12 04:28:11,886 [main]  (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/node.properties
[ERROR] 2019-11-12 04:28:11,896 [main]  (org.dataone.configuration.Settings:getConfiguration:109) org.apache.commons.configuration.ConfigurationException while loading config file org/dataone/configuration/config.xml. Message: org.apache.commons.configuration.ConfigurationRuntimeException: org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source file:/etc/dataone/index/d1client.properties
Nov 12, 2019 4:28:11 AM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Nov 12, 2019 4:28:12 AM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Nov 12, 2019 4:28:12 AM com.hazelcast.impl.AddressPicker
WARNING: Picking loopback address [127.0.0.1]; setting 'java.net.preferIPv4Stack' to true.
Nov 12, 2019 4:28:12 AM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5720, using socket ServerSocket[addr=/0.0.0.0,localport=5720], bind any local is true
Nov 12, 2019 4:28:12 AM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5720
Nov 12, 2019 4:28:12 AM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Nov 12, 2019 4:28:12 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTING
Nov 12, 2019 4:28:12 AM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5720 [DataONEBuildTest] 


Members [1] {
	Member [127.0.0.1]:5720 this
}

Nov 12, 2019 4:28:12 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTED
[ INFO] 2019-11-12 04:28:13,633 [main]  (org.dataone.cn.indexer.solrhttp.SolrSchema:loadSolrSchemaDocument:298) loading schema document from path: ./src/test/resources/org/dataone/cn/index/resources/solr5home/collection1/conf/schema.xml
[ WARN] 2019-11-12 04:28:13,911 [main]  (org.dataone.cn.index.util.PerformanceLogger:<init>:19) Setting up PerformanceLogger: set to enabled? true
[ WARN] 2019-11-12 04:28:14,887 [main]  (org.dataone.cn.index.processor.IndexTaskProcessor:<init>:162) IndexTaskProcessor initialized with stated number of threads = 5
Nov 12, 2019 4:28:16 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTTING_DOWN
Nov 12, 2019 4:28:17 AM com.hazelcast.initializer
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Destroying node initializer.
Nov 12, 2019 4:28:17 AM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Shutdown is completed in 641 ms.
Nov 12, 2019 4:28:17 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTDOWN
Tests run: 6, Failures: 0, Errors: 0, Skipped: 5, Time elapsed: 6.018 sec
Running org.dataone.cn.indexer.solrhttp.SolrUpdatePerformanceIT
Nov 12, 2019 4:28:17 AM com.hazelcast.config.ClasspathXmlConfig
INFO: Configuring Hazelcast from 'org/dataone/configuration/hazelcast.xml'.
Nov 12, 2019 4:28:17 AM com.hazelcast.impl.AddressPicker
INFO: Interfaces is disabled, trying to pick one address from TCP-IP config addresses: [127.0.0.1]
Nov 12, 2019 4:28:17 AM com.hazelcast.impl.AddressPicker
INFO: Picked Address[127.0.0.1]:5720, using socket ServerSocket[addr=/0.0.0.0,localport=5720], bind any local is true
Nov 12, 2019 4:28:17 AM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Community Edition 2.4.1 (20121213) starting at Address[127.0.0.1]:5720
Nov 12, 2019 4:28:17 AM com.hazelcast.system
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Copyright (C) 2008-2012 Hazelcast.com
Nov 12, 2019 4:28:17 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTING
Nov 12, 2019 4:28:17 AM com.hazelcast.impl.TcpIpJoiner
INFO: [127.0.0.1]:5720 [DataONEBuildTest] 


Members [1] {
	Member [127.0.0.1]:5720 this
}

Nov 12, 2019 4:28:17 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is STARTED
TestSolrUpdate-1573532897644
[ WARN] 2019-11-12 04:28:17,990 [main]  (org.dataone.client.auth.CertificateManager:<init>:203) FileNotFound: No certificate installed in the default location: /tmp/x509up_u107
[ WARN] 2019-11-12 04:28:18,037 [main]  (org.dataone.client.utils.HttpConnectionMonitorService$SingletonHolder:<clinit>:46) Starting monitor thread
[ WARN] 2019-11-12 04:28:18,037 [Thread-6]  (org.dataone.client.utils.HttpConnectionMonitorService:run:96) Starting monitoring...
[ WARN] 2019-11-12 04:28:18,038 [main]  (org.dataone.client.utils.HttpConnectionMonitorService:addMonitor:65) registering ConnectionManager...
[ WARN] 2019-11-12 04:28:18,333 [main]  (org.apache.http.client.protocol.ResponseProcessCookies:processCookies:121) Cookie rejected [JSESSIONID="55D5D89FDDB2F0373D51242975C98C0E", version:0, domain:cn-dev.test.dataone.org, path:/metacat, expiry:null] Illegal path attribute "/metacat". Path of origin: "/cn/v2/formats"
[ INFO] 2019-11-12 04:28:18,400 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,401 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6889f56f
[ INFO] 2019-11-12 04:28:18,401 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532897644
[ WARN] 2019-11-12 04:28:18,401 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-0 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898401
[ INFO] 2019-11-12 04:28:18,405 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,405 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@3b4ef59f
[ INFO] 2019-11-12 04:28:18,406 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898401
[ WARN] 2019-11-12 04:28:18,406 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-1 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898406
[ INFO] 2019-11-12 04:28:18,409 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,409 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@36c07c75
[ INFO] 2019-11-12 04:28:18,410 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898406
[ WARN] 2019-11-12 04:28:18,410 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-2 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898410
[ INFO] 2019-11-12 04:28:18,413 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,413 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@31ab1e67
[ INFO] 2019-11-12 04:28:18,413 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898410
[ WARN] 2019-11-12 04:28:18,413 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-3 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898413
[ INFO] 2019-11-12 04:28:18,417 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,417 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@67f946c3
[ INFO] 2019-11-12 04:28:18,417 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898413
[ WARN] 2019-11-12 04:28:18,417 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-4 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898417
[ INFO] 2019-11-12 04:28:18,421 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,421 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@4fd74223
[ INFO] 2019-11-12 04:28:18,421 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898417
[ WARN] 2019-11-12 04:28:18,421 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-5 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898421
[ INFO] 2019-11-12 04:28:18,424 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,424 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@6812c8cc
[ INFO] 2019-11-12 04:28:18,424 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898421
[ WARN] 2019-11-12 04:28:18,424 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-6 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898425
[ INFO] 2019-11-12 04:28:18,428 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,428 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@1df9186f
[ INFO] 2019-11-12 04:28:18,428 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898425
[ WARN] 2019-11-12 04:28:18,428 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-7 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898428
[ INFO] 2019-11-12 04:28:18,431 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,431 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@66e341e9
[ INFO] 2019-11-12 04:28:18,431 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898428
[ WARN] 2019-11-12 04:28:18,431 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-8 is null, so skipping processing with content subprocessors
TestSolrUpdate-1573532898431
[ INFO] 2019-11-12 04:28:18,434 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:78) number of documents from parseDocuments: 1
[ INFO] 2019-11-12 04:28:18,434 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:94)  main document from parseDocuments: org.dataone.cn.indexer.solrhttp.SolrDoc@216c22ce
[ INFO] 2019-11-12 04:28:18,434 [main]  (org.dataone.cn.indexer.AbstractStubMergingSubprocessor:processDocument:96)  main document id from parseDocuments: TestSolrUpdate-1573532898431
[ WARN] 2019-11-12 04:28:18,434 [main]  (org.dataone.cn.indexer.SolrIndexServiceV2:processInsertTask:342) The optional objectPath for pid solrUpdateTestSeries-1573532897642-9 is null, so skipping processing with content subprocessors
========================
 iterations: 10
 avg. time ms: 64
========================
===========================
Queries: 0
   Total time: 0
   Avg time: 0

===========================
Updates: 0
   Total time: 0
   Avg time: 0
   commit within (ms): 250

Nov 12, 2019 4:28:20 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTTING_DOWN
Nov 12, 2019 4:28:21 AM com.hazelcast.initializer
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Destroying node initializer.
Nov 12, 2019 4:28:21 AM com.hazelcast.impl.Node
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Hazelcast Shutdown is completed in 689 ms.
Nov 12, 2019 4:28:21 AM com.hazelcast.impl.LifecycleServiceImpl
INFO: [127.0.0.1]:5720 [DataONEBuildTest] Address[127.0.0.1]:5720 is SHUTDOWN
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 3.517 sec
Running org.dataone.cn.indexer.solrhttp.SolrClientUpdateMechanismIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 3, Time elapsed: 0.002 sec

Results :

Tests run: 12, Failures: 0, Errors: 0, Skipped: 9

[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-failsafe-plugin:2.8.1:verify (verify) @ d1_cn_index_processor ---
[INFO] Failsafe report directory: /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/failsafe-reports
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ d1_cn_index_processor ---
[INFO] Installing /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/target/d1_cn_index_processor-2.4.0-SNAPSHOT.jar to /var/lib/jenkins/.m2/repository/org/dataone/d1_cn_index_processor/2.4.0-SNAPSHOT/d1_cn_index_processor-2.4.0-SNAPSHOT.jar
[INFO] Installing /var/lib/jenkins/jobs/d1_cn_index_processor/workspace/pom.xml to /var/lib/jenkins/.m2/repository/org/dataone/d1_cn_index_processor/2.4.0-SNAPSHOT/d1_cn_index_processor-2.4.0-SNAPSHOT.pom
Notifying upstream projects of job completion
Join notifier requires a CauseAction
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  02:41 min
[INFO] Finished at: 2019-11-12T04:28:22Z
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data