StackOverflowError when loading an Avro file to create a DataFrame

Asked: 2017-01-31 08:06:30

Tags: apache-spark-sql avro spark-avro

I am hitting this error while trying to load an Avro file (about 134 KB) into a DataFrame. My pom dependencies are listed below. The Avro file itself is created from a protobuf message, and that step works fine (rough sketches of both steps follow the dependency list).

pom dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-protobuf</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>3.0.0</version>
</dependency>
<dependency>
    <groupId>com.databricks</groupId>
    <artifactId>spark-avro_2.11</artifactId>
    <version>3.0.0</version>
</dependency>
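
For context, the protobuf-to-Avro step is along these lines (a minimal sketch, not the exact code; MyMessage, the message instance, and the output path are placeholders):

    import java.io.File;
    import java.io.IOException;

    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.protobuf.ProtobufData;
    import org.apache.avro.protobuf.ProtobufDatumWriter;

    public class WriteAvroFromProtobuf {
        public static void main(String[] args) throws IOException {
            // MyMessage stands in for the generated protobuf class; message is an instance of it.
            MyMessage message = MyMessage.newBuilder().build();

            // Derive an Avro schema from the protobuf class and write the record to an Avro container file.
            Schema schema = ProtobufData.get().getSchema(MyMessage.class);
            ProtobufDatumWriter<MyMessage> datumWriter = new ProtobufDatumWriter<>(MyMessage.class);
            try (DataFileWriter<MyMessage> fileWriter = new DataFileWriter<>(datumWriter)) {
                fileWriter.create(schema, new File("/tmp/messages.avro"));
                fileWriter.append(message);
            }
        }
    }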

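The load that triggers the error is essentially the following (again a sketch; the session setup and path are placeholders):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class LoadAvroToDataFrame {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("AvroToDataFrame")
                    .master("local[*]")
                    .getOrCreate();

            // The StackOverflowError is thrown here, while spark-avro converts the Avro schema
            // to a Spark SQL schema (SchemaConverters.toSqlType in the stack trace below).
            Dataset<Row> df = spark.read()
                    .format("com.databricks.spark.avro")
                    .load("/tmp/messages.avro");
            df.show();
        }
    }
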
Exception:

Exception in thread "main" java.lang.StackOverflowError
    at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:42)
    at scala.collection.Iterator$class.exists(Iterator.scala:919)
    at scala.collection.AbstractIterator.exists(Iterator.scala:1336)
    at scala.collection.IterableLike$class.exists(IterableLike.scala:77)
    at scala.collection.AbstractIterable.exists(Iterable.scala:54)
    at com.databricks.spark.avro.SchemaConverters$.toSqlType(SchemaConverters.scala:75)
    at com.databricks.spark.avro.SchemaConverters$$anonfun$1.apply(SchemaConverters.scala:56)
    at com.databricks.spark.avro.SchemaConverters$$anonfun$1.apply(SchemaConverters.scala:55)

0 Answers:

No answers