I am working with a file of tweets:
396124436845178880,"When's 12.4k gonna roll around",Matty_T_03
396124437168537600,"I really wish I didn't give up everything I did for you. I'm so mad at my self for even letting it get as far as it did.",savava143
396124436958412800,"I really need to double check who I'm sending my snapchats to before sending it ",juliannpham
396124437218885632,"@Darrin_myers30 I feel you man, gotta stay prayed up. Year is important",Ful_of_Ambition
396124437558611968,"tell me what I did in my life to deserve this.",_ItsNotBragging
396124437499502592,"Too many fine men out here...see me drooling",LolaofLife
396124437722198016,"@jaiclynclausen will do",I_harley99
I am trying to replace all the special characters after reading the file into an RDD:
val fileReadRdd = sc.textFile(fileInput)
val fileReadRdd2 = fileReadRdd.map(x => x.map(_.replace(","," ")))
val fileFlat = fileReadRdd.flatMap(rec => rec.split(" "))
I am getting the following error:
Error:(41, 57) value replace is not a member of Char
val fileReadRdd2 = fileReadRdd.map(x => x.map(_.replace(",","")))
Answer 0 (score: 3)
I suspect that:
x => x.map(_.replace(",",""))
treats your string as a sequence of characters, when what you actually want is
x => x.replace(",", "")
(i.e. you don't need to map over the string's sequence of characters)
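The point above can be seen in plain Scala, without Spark: mapping over a `String` iterates over `Char` values, and `Char` has no `replace` method, which is exactly the compiler error reported. A minimal sketch:

```scala
object ReplaceDemo extends App {
  val line = "hello,world"

  // This would NOT compile, reproducing the question's error:
  //   line.map(_.replace(",", " "))
  // because the lambda receives each element as a Char.

  // Correct: call replace on the String itself.
  val cleaned = line.replace(",", " ")
  println(cleaned) // hello world

  // Alternative, if you do want per-character logic: map Char => Char.
  val cleaned2 = line.map(c => if (c == ',') ' ' else c)
  println(cleaned2) // hello world
}
```

Inside Spark, the same fix applies per record: `fileReadRdd.map(x => x.replace(",", " "))`.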
Answer 1 (score: 0)
The equivalent of Perl's one-liner `perl -pi -e 's/\s+//' $file` in Spark Scala, working against any Spark-supported filesystem, looks like the following (feel free to adjust the regex):
import org.apache.spark.rdd.RDD

// read the file into an RDD of strings
val rdd: RDD[String] = spark.sparkContext.textFile(uri)
// for each line in rdd apply pattern and save to file
rdd
.map(line => line.replaceAll("^\\s+", ""))
.saveAsTextFile(uri + ".tmp")
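The per-line transformation above can be sketched on a plain Scala collection to see what the regex does before running it on a cluster (the sample lines here are made up for illustration):

```scala
// Hypothetical sample lines standing in for the RDD's contents.
val lines = Seq("  leading spaces", "no spaces", "\ttabbed")

// Same logic as the rdd.map step: strip leading whitespace from each line.
val trimmed = lines.map(line => line.replaceAll("^\\s+", ""))

println(trimmed) // List(leading spaces, no spaces, tabbed)
```

Note that `^\s+` only strips *leading* whitespace; to mimic the Perl one-liner's `\s+` exactly (removing the first run of whitespace anywhere in the line), drop the `^` anchor.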