';' expected but 'import' found - Scala and Spark

Asked: 2015-05-24 10:59:26

Tags: scala apache-spark compiler-errors apache-spark-mllib

I'm trying out Spark and Scala by compiling a standalone application, and I don't know why I'm getting this error:

topicModel.scala:2: ';' expected but 'import' found.
[error] import org.apache.spark.mllib.clustering.LDA
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed

This is the build.sbt code:

name := "topicModel"

version := "1.0"

scalaVersion := "2.11.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.3.1"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1"

1 Answer:

Answer 0 (score: 0)

Could it be that your file has old Macintosh line endings (\r)?
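
If that is the cause, the error makes sense: scalac infers the semicolons that separate statements at \n line breaks, but a lone \r is treated as plain whitespace rather than a line break, so consecutive imports look like one run-on line to the parser and it demands an explicit ';' before the second 'import'. As a quick check, a small throwaway Scala program along these lines (the FixLineEndings object and its logic are my sketch, not part of the original post) can count bare \r terminators in topicModel.scala and rewrite them as \n:

import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

object FixLineEndings {
  def main(args: Array[String]): Unit = {
    val path = Paths.get("topicModel.scala")
    val text = new String(Files.readAllBytes(path), StandardCharsets.UTF_8)

    // A bare \r that is not followed by \n is a classic Mac OS line ending.
    val bareCRs = raw"\r(?!\n)".r.findAllIn(text).size
    println(s"Bare \\r line endings found: $bareCRs")

    if (bareCRs > 0) {
      // Normalize every \r\n pair and lone \r to a plain \n, then rewrite the file.
      val fixed = text.replaceAll("\r\n?", "\n")
      Files.write(path, fixed.getBytes(StandardCharsets.UTF_8))
      println("Rewrote topicModel.scala with Unix line endings.")
    }
  }
}

After normalizing the line endings, sbt compile should get past the parser; any remaining errors would then be genuine ones.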

For more information, see Why do I need semicolons after these imports?