SparkR: adding computed columns to a Spark DataFrame

Date: 2015-12-29 05:25:50

Tags: sparkr

I'm trying to add some computed columns to a SparkR DataFrame, like this:

Orders <- withColumn(Orders, "Ready.minus.In.mins",   
(unix_timestamp(Orders$ReadyTime) - unix_timestamp(Orders$InTime)) / 60)
Orders <- withColumn(Orders, "Out.minus.In.mins", 
(unix_timestamp(Orders$OutTime) - unix_timestamp(Orders$InTime)) / 60)

The first command runs fine, and head(Orders) shows the new column. The second command throws an error:

15/12/29 05:10:02 ERROR RBackendHandler: col on 359 failed
Error in select(x, x$"*", alias(col, colName)) : 
error in evaluating the argument 'col' in selecting a method for function 
'select': Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) :
org.apache.spark.sql.AnalysisException: Cannot resolve column name 
"Ready.minus.In.mins" among (ASAP, AddressLine, BasketCount, CustomerEmail, CustomerID, CustomerName, CustomerPhone, DPOSCustomerID, DPOSOrderID, ImportedFromOldDb, InTime, IsOnlineOrder, LineItemTotal, NetTenderedAmount, OrderDate, OrderID, OutTime, Postcode, ReadyTime, SnapshotID, StoreID, Suburb, TakenBy, TenderType, TenderedAmount, TransactionStatus, TransactionType, hasLineItems, Ready.minus.In.mins);
at org.apache.spark.sql.DataFrame$$anonfun$resolve$1.apply(DataFrame.scala:159)
at org.apache.spark.sql.DataFrame$$anonfun$resolve$1.apply(DataFrame.scala:159)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.DataFrame.resolve(DataFrame.scala:158)
at org.apache.spark.sql.DataFrame$$anonfun$col$1.apply(DataFrame.scala:650)
at org.apa 

Do I need to do something to the DataFrame after adding the first new column before it will accept another one?

2 answers:

Answer 0 (score: 1)

From the link, just use backticks when accessing the column. For example:

Instead of

df['Fields.fields1']

or whatever, use:

df['`Fields.fields1`']
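
In SparkR specifically, a minimal sketch of the same idea, assuming the dotted column already exists, is to quote the name with backticks inside a SQL expression passed to selectExpr:

# Reference the dotted column by escaping it with backticks in a SQL expression
head(selectExpr(Orders, "`Ready.minus.In.mins`"))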

Answer 1 (score: 0)

Found here: spark-issues mailing list archives

SparkR is not entirely happy with "." in column names.
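
Given that, the simplest workaround is to avoid "." in the new column names altogether. A sketch of the question's code with underscores substituted for the dots (the underscore names are illustrative); both withColumn calls should then resolve cleanly:

# Same computations as in the question, but with "_" instead of "." in the new names
Orders <- withColumn(Orders, "Ready_minus_In_mins",
  (unix_timestamp(Orders$ReadyTime) - unix_timestamp(Orders$InTime)) / 60)
Orders <- withColumn(Orders, "Out_minus_In_mins",
  (unix_timestamp(Orders$OutTime) - unix_timestamp(Orders$InTime)) / 60)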