I am trying to run the logistic regression example (https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaLogisticRegressionWithElasticNetExample.java).
Here is the code:
I am also using the same data file as the example (https://github.com/apache/spark/blob/master/data/mllib/sample_libsvm_data.txt), but I get the errors shown further down.
import org.apache.spark.ml.classification.LogisticRegression;
import org.apache.spark.ml.classification.LogisticRegressionModel;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public final class GettingStarted {
  public static void main(final String[] args) throws InterruptedException {
    System.setProperty("hadoop.home.dir", "C:\\winutils");

    SparkSession spark = SparkSession
        .builder()
        .appName("JavaLogisticRegressionWithElasticNetExample")
        .config("spark.master", "local")
        .getOrCreate();

    // $example on$
    // Load training data
    Dataset<Row> training = spark.read().format("libsvm")
        .load("data/mllib/sample_libsvm_data.txt");

    LogisticRegression lr = new LogisticRegression()
        .setMaxIter(10)
        .setRegParam(0.3)
        .setElasticNetParam(0.8);

    // Fit the model
    LogisticRegressionModel lrModel = lr.fit(training);

    // Print the coefficients and intercept for logistic regression
    System.out.println("Coefficients: "
        + lrModel.coefficients() + " Intercept: " + lrModel.intercept());

    // We can also use the multinomial family for binary classification
    LogisticRegression mlr = new LogisticRegression()
        .setMaxIter(10)
        .setRegParam(0.3)
        .setElasticNetParam(0.8)
        .setFamily("multinomial");

    // Fit the model
    LogisticRegressionModel mlrModel = mlr.fit(training);

    // Print the coefficients and intercepts for logistic regression with multinomial family
    System.out.println("Multinomial coefficients: " + mlrModel.coefficientMatrix()
        + "\nMultinomial intercepts: " + mlrModel.interceptVector());
    // $example off$

    spark.stop();
  }
}
Do you know what I am doing wrong?
Edit: I am running it from IntelliJ as a Maven project, and I have added the Spark dependencies. This is the stack trace I get:
Exception in thread "main" java.lang.AssertionError: assertion failed: unsafe symbol CompatContext (child of package macrocompat) in runtime reflection universe
at scala.reflect.internal.Symbols$Symbol.<init>(Symbols.scala:184)
at scala.reflect.internal.Symbols$TypeSymbol.<init>(Symbols.scala:2984)
at scala.reflect.internal.Symbols$ClassSymbol.<init>(Symbols.scala:3176)
at scala.reflect.internal.Symbols$StubClassSymbol.<init>(Symbols.scala:3471)
at scala.reflect.internal.Symbols$Symbol.newStubSymbol(Symbols.scala:498)
at scala.reflect.internal.pickling.UnPickler$Scan.readExtSymbol$1(UnPickler.scala:258)
at scala.reflect.internal.pickling.UnPickler$Scan.readSymbol(UnPickler.scala:284)
at scala.reflect.internal.pickling.UnPickler$Scan.readSymbolRef(UnPickler.scala:649)
at scala.reflect.internal.pickling.UnPickler$Scan.readType(UnPickler.scala:417)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725)
at scala.reflect.internal.pickling.UnPickler$Scan.at(UnPickler.scala:179)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.completeInternal(UnPickler.scala:725)
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.complete(UnPickler.scala:749)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1489)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:162)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:162)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.info(SynchronizedSymbols.scala:162)
at scala.reflect.internal.Mirrors$RootsBase.ensureClassSymbol(Mirrors.scala:94)
at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
at scala.reflect.internal.Mirrors$RootsBase.getClassIfDefined(Mirrors.scala:114)
at scala.reflect.internal.Mirrors$RootsBase.getClassIfDefined(Mirrors.scala:111)
at scala.reflect.internal.Definitions$DefinitionsClass.BlackboxContextClass$lzycompute(Definitions.scala:496)
at scala.reflect.internal.Definitions$DefinitionsClass.BlackboxContextClass(Definitions.scala:496)
at scala.reflect.runtime.JavaUniverseForce$class.force(JavaUniverseForce.scala:305)
at scala.reflect.runtime.JavaUniverse.force(JavaUniverse.scala:16)
at scala.reflect.runtime.JavaUniverse.init(JavaUniverse.scala:147)
at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:78)
at scala.reflect.runtime.package$.universe$lzycompute(package.scala:17)
at scala.reflect.runtime.package$.universe(package.scala:17)
at org.apache.spark.sql.catalyst.ScalaReflection$.<init>(ScalaReflection.scala:40)
at org.apache.spark.sql.catalyst.ScalaReflection$.<clinit>(ScalaReflection.scala)
at org.apache.spark.sql.catalyst.encoders.RowEncoder$.org$apache$spark$sql$catalyst$encoders$RowEncoder$$serializerFor(RowEncoder.scala:74)
at org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(RowEncoder.scala:61)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67)
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:415)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:172)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:156)
at GettingStarted.main(GettingStarted.java:95)
Answer (score: 2):
tl;dr: As soon as you start seeing errors from Scala internals that mention the reflection universe, suspect incompatible Scala versions.
The Scala versions of your libraries do not match each other (2.10 and 2.11).
You should align everything on the same Scala version:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>   <!-- This is Scala v2.11 -->
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId> <!-- This is Scala v2.10 -->
    <version>2.2.0</version>
</dependency>
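For illustration, here is a sketch of what the aligned block could look like, assuming you standardize on Scala 2.11 (if the rest of your build is on 2.10, use the _2.10 artifacts everywhere instead). Factoring the Scala binary version and the Spark version into Maven properties keeps the two artifacts from drifting apart again; the property names below are just a convention, not something Spark requires:

<properties>
    <!-- Assumption: standardize on Scala 2.11; switch to 2.10 everywhere if that is what your project needs -->
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.2.0</spark.version>
</properties>

<dependencies>
    <!-- Both Spark modules now share the same Scala suffix and Spark version -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

After changing the pom, re-import the Maven project in IntelliJ and run mvn dependency:tree -Dincludes=org.scala-lang to confirm that only one Scala library version ends up on the classpath.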