How to fix an error with a sparklyr model in R

Asked: 2019-03-31 08:56:23

Tags: r sparklyr

Can someone help me fix this sparklyr error in R?

kmeans_model <- iris_tbl %>%
   select(Petal_Width, Petal_Length) %>%
   ml_kmeans(centers = 3)
  

Error: java.lang.IllegalArgumentException: Field "features" does not exist. Available fields: Petal_Width, Petal_Length

    at org.apache.spark.sql.types.StructType$$anonfun$apply$1.apply(StructType.scala:274)
    at org.apache.spark.sql.types.StructType$$anonfun$apply$1.apply(StructType.scala:274)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at scala.collection.AbstractMap.getOrElse(Map.scala:59)
    at org.apache.spark.sql.types.StructType.apply(StructType.scala:273)
    at org.apache.spark.ml.util.SchemaUtils$.checkColumnTypes(SchemaUtils.scala:58)
    at org.apache.spark.ml.util.SchemaUtils$.validateVectorCompatibleColumn(SchemaUtils.scala:119)
    at org.apache.spark.ml.clustering.KMeansParams$class.validateAndTransformSchema(KMeans.scala:96)
    at org.apache.spark.ml.clustering.KMeans.validateAndTransformSchema(KMeans.scala:285)
    at org.apache.spark.ml.clustering.KMeans.transformSchema(KMeans.scala:382)
    at org.apache.spark.ml.PipelineStage.transformSchema(Pipeline.scala:74)
    at org.apache.spark.ml.clustering.KMeans$$anonfun$fit$1.apply(KMeans.scala:341)
    at org.apache.spark.ml.clustering.KMeans$$anonfun$fit$1.apply(KMeans.scala:340)
    at org.apache.spark.ml.util.Instrumentation$$anonfun$11.apply(Instrumentation.scala:183)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:183)
    at org.apache.spark.ml.clustering.KMeans.fit(KMeans.scala:340)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at sparklyr.Invoke.invoke(invoke.scala:139)
    at sparklyr.StreamHandler.handleMethodCall(stream.scala:123)
    at sparklyr.StreamHandler.read(stream.scala:66)
    at sparklyr.BackendHandler.channelRead0(handler.scala:51)
    at sparklyr.BackendHandler.channelRead0(handler.scala:4)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Unknown Source)

Warning: Some components of ... were not used: centers

I have also tried the formula interface, but asking for 3 clusters still only produces 2 clusters:

kmeans_model <- iris_tbl %>% 
ml_kmeans(formula= ~ Petal_Width + Petal_Length, centers = 3)

#Warning: Some components of ... were not used: centers

print(kmeans_model)
#K-means clustering with 2 clusters
#
#Cluster centers:
#  Petal_Width Petal_Length
#1   1.6818182     4.925253
#2   0.2627451     1.492157
#
#Within Set Sum of Squared Errors =  86.39022

1 Answer:

Answer 0 (score: 0)

The first line of the error is straightforward:

    Field "features" does not exist.

If you look at the documentation in ?ml_kmeans, you will see that you need to specify either a formula (your second attempt) or a features_col. A quick note on features in Spark models: they need to be assembled (vectorized) into a single column of the data.frame.
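As a rough illustration of that point (a sketch only, assuming iris_tbl was created with something like sdf_copy_to(sc, iris), which renames the columns to Petal_Width etc.), you can assemble the two predictors and look at the resulting schema:

library(sparklyr)

# Assemble the predictors into one vector column and inspect the schema;
# the "features" entry should show up as a single vector-typed column
# holding both predictors per row.
iris_tbl %>%
  ft_vector_assembler(input_cols = c("Petal_Width", "Petal_Length"),
                      output_col = "features") %>%
  sdf_schema()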

Your second error/warning message is also straightforward:

    Warning: Some components of ... were not used: centers

centers is not an argument of ml_kmeans. What you want to use is k:

kmeans_model <- iris_tbl %>% 
  ml_kmeans(formula = ~ Petal_Width + Petal_Length, k = 3)

kmeans_model
# K-means clustering with 3 clusters
# 
# Cluster centers:
#   Petal_Width Petal_Length
# 1    1.359259     4.292593
# 2    0.246000     1.462000
# 3    2.047826     5.626087
# 
# Within Set Sum of Squared Errors =  31.41289
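If you also want the cluster assignment for each row, ml_predict() on the fitted model should work (a minimal sketch; the formula preprocessing is reapplied for you, and the cluster label lands in a column named "prediction" by default):

# Score the same table with the fitted model; "prediction" holds the
# cluster index for each row.
predictions <- ml_predict(kmeans_model, iris_tbl)

predictions %>%
  dplyr::select(Petal_Width, Petal_Length, prediction) %>%
  head()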

To run it without a formula, you need to use ft_vector_assembler to pack the predictor columns into a single features column first:

kmeans_model <- iris_tbl %>% 
  ft_vector_assembler(input_cols = c("Petal_Width", "Petal_Length"), output_col = "features") %>%
  ml_kmeans(k = 3)
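ml_kmeans() picks up the assembled column automatically because features_col defaults to "features". If you give the assembled column a different name, point features_col at it explicitly (again just a sketch, with a hypothetical column name):

# Hypothetical variant: a custom output column name, passed via features_col.
kmeans_model <- iris_tbl %>% 
  ft_vector_assembler(input_cols = c("Petal_Width", "Petal_Length"),
                      output_col = "kmeans_features") %>%
  ml_kmeans(k = 3, features_col = "kmeans_features")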