Error importing types from Spark SQL

Asked: 2015-06-10 19:58:13

Tags: scala apache-spark-sql

I am trying to import the types from Spark SQL like this:

import org.apache.spark.sql.types._

But I get errors such as "not found: value DataType" and "not found: type ByteType".


1 answer:

Answer 0 (score: 3):

ByteType and friends are not types but singleton case objects, so they are matched directly by name (no DataType. prefix and no type annotation).

So you probably want something like this:

 import java.math.BigDecimal
 import java.sql.{Date, Timestamp}
 import org.apache.spark.sql.types._

 def castTo(datum: String, castType: DataType): Any =
   castType match {
     case ByteType       => datum.toByte
     case ShortType      => datum.toShort
     case IntegerType    => datum.toInt
     case LongType       => datum.toLong
     case FloatType      => datum.toFloat
     case DoubleType     => datum.toDouble
     case BooleanType    => datum.toBoolean
     // DecimalType is a case class (it carries precision/scale),
     // so match it with a type pattern rather than by name:
     case _: DecimalType => new BigDecimal(datum.replaceAll(",", ""))
     case TimestampType  => Timestamp.valueOf(datum)
     case DateType       => Date.valueOf(datum)
     case StringType     => datum
     case _              => throw new RuntimeException(s"Unsupported type: ${castType.typeName}")
   }

(At least in my Spark version there is no DecimalType, and castType has no typeName field.)
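The key point is that these are case objects, matched by identity rather than by a type pattern. A minimal standalone sketch of the same idea, using hypothetical stand-in objects (MyIntType and MyStringType are illustrative names, not Spark's) so it runs without a Spark dependency:

```scala
// Hypothetical stand-ins for Spark's singleton type objects
// (illustrative only -- not part of Spark).
sealed trait MyDataType
case object MyIntType extends MyDataType
case object MyStringType extends MyDataType

// Case objects are matched directly, like `case ByteType =>` above:
// no `MyDataType.` prefix and no type pattern needed.
def cast(datum: String, t: MyDataType): Any = t match {
  case MyIntType    => datum.toInt
  case MyStringType => datum
}

println(cast("42", MyIntType))   // prints 42
```

This also shows why `DataType.ByteType` fails to compile: the singletons live at the package level (here, the top level), not as members of the DataType trait.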