SparkDeploySchedulerBackend error: Application has been killed. All masters are unresponsive

Date: 2015-05-10 06:46:50

Tags: apache-spark

I am starting the Spark shell.


I get the following error:

ERROR SparkDeploySchedulerBackend: Application has been killed. All masters are unresponsive! Giving up.

I installed Spark following this link: http://www.philchen.com/2015/02/16/how-to-install-apache-spark-and-cassandra-stack-on-ubuntu

4 answers:

Answer 0 (score: 4)

You should provide the master URL of your Spark cluster when starting spark-shell.

At a minimum:

bin/spark-shell --master spark://master-ip:7077

The full list of options is long; you can find the ones you need yourself with:

bin/spark-shell --help
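
Once the shell comes up, a quick sanity check (assuming a Spark 1.x shell, where the SparkContext is already bound to sc) is to ask which master it actually connected to and run a trivial job; the outputs shown here are illustrative:

scala> sc.master
res0: String = spark://master-ip:7077

scala> sc.parallelize(1 to 1000).count()
res1: Long = 1000

If sc.master reports something other than your cluster URL, the --master flag did not take effect.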

Answer 1 (score: 0)

I assume you are running this in standalone/local mode. Run your spark-shell with the following line. It tells Spark to use all the available cores of your master, i.e. the local machine:

bin/spark-shell --master local[*]

http://spark.apache.org/docs/1.2.1/submitting-applications.html#master-urls
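
If you later move from the shell to a standalone application, the same choice is made through SparkConf. A minimal sketch (the object name is hypothetical; setMaster and setAppName are the standard SparkConf calls):

import org.apache.spark.{SparkConf, SparkContext}

object LocalModeExample {
  def main(args: Array[String]): Unit = {
    // Mirrors --master local[*]: use every core on the local machine
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("LocalModeExample")
    val sc = new SparkContext(conf)

    // Trivial job to confirm the context is alive
    println(sc.parallelize(1 to 100).sum())

    sc.stop()
  }
}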

Answer 2 (score: 0)

Check your log files for "permission denied" errors: your client service probably does not have the proper permissions to access your master folders.

Answer 3 (score: 0)

You also need to start the Spark master and a worker before issuing the spark-submit command:

start-master.sh
start-slave.sh spark://spark:7077

Then use:

spark-submit --master spark://spark:7077
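
For completeness, here is a minimal application that could be packaged and submitted this way (the object name is hypothetical). Note that the master is deliberately not set in code, so the URL passed to spark-submit decides where it runs:

import org.apache.spark.{SparkConf, SparkContext}

object SubmitExample {
  def main(args: Array[String]): Unit = {
    // No setMaster here: the --master flag given to spark-submit wins
    val conf = new SparkConf().setAppName("SubmitExample")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 1000).count())
    sc.stop()
  }
}

Packaged into a jar, it would run with something like spark-submit --class SubmitExample --master spark://spark:7077 path/to/example.jar.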