neo4j spark connector: mvn clean install assembly:single error

Time: 2016-08-24 14:05:31

Tags: maven apache-spark neo4j connector

I am working with the neo4j-spark connector. After cloning it and changing into the directory, I ran mvn clean install assembly:single. It throws this error:

Tests in error: 
  runMatrixQueryDFSchema(org.neo4j.spark.Neo4jDataFrameTest): Job aborted due to stage failure: Task 0 in stage 3.0 failed 1 times, most recent failure: Lost task 0.0 in stage 3.0 (TID 12, localhost): java.util.NoSuchElementException: None.get

Tests run: 7, Failures: 0, Errors: 1, Skipped: 1

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 42.538s
[INFO] Finished at: Wed Aug 24 19:18:07 IST 2016
[INFO] Final Memory: 72M/1016M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.10:test (default-test) on project neo4j-spark-connector: There are test failures.
[ERROR] 
[ERROR] Please refer to /data/neo4j-spark-connector/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

How do I get rid of this? Do I need to build target/neo4j-spark-connector_2.11-full-2.0.0-M1.jar for Scala 2.11? I have Spark 2.0.0 installed, and I keep it in the same directory as neo4j-spark-connector. Do I need to keep it somewhere else? How do I get this to work?
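If the immediate goal is just to produce the jar locally, one possible workaround (a sketch, not the connector's documented build procedure) is to skip the test phase; -DskipTests is a standard Maven Surefire option, not anything specific to this project:

# build the assembly without running the Surefire tests (this skips all tests, not just the failing one)
mvn clean install assembly:single -DskipTests

This still compiles and assembles the jar under target/; it only bypasses the test run that is aborting the build.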

1 Answer:

Answer 0 (score: 0):

This issue has now been fixed. You can pull the latest changes from the neo4j-spark-connector repository and it should work.
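As a rough sketch of that (the GitHub URL below is an assumption about where the upstream repository lives, and the exact jar name under target/ may differ for newer versions):

# fetch the latest sources and rebuild the assembly
git clone https://github.com/neo4j-contrib/neo4j-spark-connector.git
cd neo4j-spark-connector
mvn clean install assembly:single

# the jar can live anywhere; pass its path to spark-shell with the standard --jars option
spark-shell --jars target/neo4j-spark-connector_2.11-full-2.0.0-M1.jar

Since --jars accepts any path, the jar does not need to sit in the same directory as your Spark installation.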

Thanks.