ClassNotFoundException using the Spark library through Bazel

Date: 2018-06-25 04:52:12

Tags: bazel spark-java

I'm trying to build a "hello world" server in Spark, using Bazel, but I am getting this error:



$ bazel run //:app
INFO: Analysed target //:app (0 packages loaded).
INFO: Found 1 target...
Target //:app up-to-date:
  bazel-bin/app.jar
  bazel-bin/app
INFO: Elapsed time: 0.201s, Critical Path: 0.00s
INFO: 0 processes.
INFO: Build completed successfully, 1 total action
INFO: Build completed successfully, 1 total action
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
        at spark.Service.<clinit>(Service.java:56)
        at spark.Spark$SingletonHolder.<clinit>(Spark.java:51)
        at spark.Spark.getInstance(Spark.java:55)
        at spark.Spark.<clinit>(Spark.java:61)
        at io.app.server.Main.main(Main.java:7)
Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 5 more

The same error happens if I don't include slf4j, and it shouldn't be a required dependency of Spark anyway.

BUILD:

java_binary(
    name = "app",
    main_class = "io.app.server.Main",
    srcs = ["src/main/java/io/app/server/Main.java"],
    deps = [
        "@org_slf4j_slf4j_simple//jar",
        "@com_sparkjava_spark_core//jar",
    ]
)

WORKSPACE:

maven_jar(
    name = "com_sparkjava_spark_core",
    artifact = "com.sparkjava:spark-core:2.7.2"
)

maven_jar(
    name = "org_slf4j_slf4j_simple",
    artifact = "org.slf4j:slf4j-simple:1.7.21"
)
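
And finally, src/main/java/io/app/server/Main.java. The original source isn't shown here, but a minimal hello-world handler using the standard spark-core API would look roughly like this:

package io.app.server;

import static spark.Spark.get;

public class Main {
    public static void main(String[] args) {
        // Touching spark.Spark runs its static initializers, which look up
        // org.slf4j.LoggerFactory -- this is where the NoClassDefFoundError fires.
        get("/hello", (req, res) -> "Hello World");
    }
}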

Any idea what I might be doing wrong here?

1 Answer:

Answer 0 (score: 3):

Found what I was missing. It seems that maven_jar does not automatically fetch the transitive dependencies of the library itself; see this:

Bazel only reads dependencies listed in your WORKSPACE file. If your project (A) depends on another project (B) which lists a dependency on a third project (C) in its WORKSPACE file, you'll have to add both B and C to your project's WORKSPACE file. This requirement can balloon the WORKSPACE file size, but hopefully limits the chances of having one library include C at version 1.0 and another include C at 2.0.

Large WORKSPACE files can be generated using the tool generate_workspace. See Generate external dependencies from Maven projects for details.

So the solution seems to be to write a pom.xml and use generate_workspace.
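
Alternatively, for a dependency graph this small, the missing transitive jar can be listed by hand. A sketch, assuming spark-core 2.7.2 depends on slf4j-api 1.7.x (slf4j-api is the jar that actually contains org.slf4j.LoggerFactory):

In WORKSPACE:

maven_jar(
    name = "org_slf4j_slf4j_api",
    artifact = "org.slf4j:slf4j-api:1.7.21",
)

And in BUILD:

java_binary(
    name = "app",
    main_class = "io.app.server.Main",
    srcs = ["src/main/java/io/app/server/Main.java"],
    deps = [
        "@com_sparkjava_spark_core//jar",
        "@org_slf4j_slf4j_api//jar",   # transitive dep of spark-core, declared explicitly
        "@org_slf4j_slf4j_simple//jar",
    ],
)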

EDIT: generate_workspace seems to be deprecated; use bazel_deps instead.
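
For what it's worth, bazel_deps is driven by a dependencies.yaml file rather than a pom.xml; a rough sketch of the equivalent declaration (schema as I recall it from the bazel_deps README, so treat the exact keys as an assumption):

dependencies:
  com.sparkjava:
    spark-core:
      lang: java
      version: "2.7.2"

bazel_deps then generates the WORKSPACE entries and BUILD targets, including the transitive slf4j jars.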