Cannot change authentication in spark-cassandra-connector

Time: 2016-03-23 15:52:19

Tags: apache-spark cassandra apache-spark-sql spark-cassandra-connector

I am building a Spark-Cassandra application (Spark 1.6.0 & spark-cassandra-connector 1.6.0-M1) in which several users supply their own Cassandra properties, such as host, username, password, keyspace, table, and others.

To change these properties dynamically and create a DataFrame from a Cassandra table, I searched around and found the following information:

http://www.russellspitzer.com/2016/02/16/Multiple-Clusters-SparkSql-Cassandra/

https://github.com/datastax/spark-cassandra-connector/blob/master/doc/14_data_frames.md#setting-cluster-and-keyspace-level-options

import org.apache.spark.sql.cassandra.CassandraSQLContext

val csc = new CassandraSQLContext(SparkConnection._sc)

// Cluster-scoped connection settings, keyed by the cluster alias
csc.setConf(s"${cluster}/spark.cassandra.connection.host", host)
csc.setConf(s"${cluster}/spark.cassandra.connection.port", port)
csc.setConf(s"${cluster}/spark.cassandra.auth.username", username)
csc.setConf(s"${cluster}/spark.cassandra.auth.password", password)

// Read the table as a DataFrame, selecting the cluster alias via the "cluster" option
csc.read.format("org.apache.spark.sql.cassandra")
  .options(Map("cluster" -> cluster, "keyspace" -> keySpace, "table" -> table))
  .load()

I tried the properties mentioned above. Clusters that do not require authentication connect successfully, but when I try to connect to a secured cluster with the username and password properties, I get the following error:

Exception in thread "Thread-10" java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.17}:9042
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:162)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
    at com.datastax.spark.connector.rdd.partitioner.CassandraRDDPartitioner$.getTokenFactory(CassandraRDDPartitioner.scala:184)
    at org.apache.spark.sql.cassandra.CassandraSourceRelation$.apply(CassandraSourceRelation.scala:267)
    at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:57)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
    at com.bdbizviz.pa.spark.util.ServiceUtil$.readData(ServiceUtil.scala:97)
    at com.bdbizviz.pa.spark.services.SparkServices$$anon$1.run(SparkServices.scala:114)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.datastax.driver.core.exceptions.AuthenticationException: Authentication error on host /192.168.1.17:9042: Host /192.168.1.17:9042 requires authentication, but no authenticator found in Cluster configuration
    at com.datastax.driver.core.AuthProvider$1.newAuthenticator(AuthProvider.java:40)
    at com.datastax.driver.core.Connection$5.apply(Connection.java:250)
    at com.datastax.driver.core.Connection$5.apply(Connection.java:234)
    at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:861)
    at com.google.common.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:297)
    at com.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
    at com.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145)
    at com.google.common.util.concurrent.AbstractFuture.set(AbstractFuture.java:185)
    at com.datastax.driver.core.Connection$Future.onSet(Connection.java:1174)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1005)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:928)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:831)
    at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:346)
    at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:254)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    ... 1 more

1 Answer:

Answer 0 (score: 0)

The configuration keys also need to include the keyspace. They should look like:

${cluster}:${keyspace}/spark.cassandra.connection.host

If you look at the second link, you can see what the setCassandraConf function translates these keys into.
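
As a rough illustration, here is a minimal sketch of how the setConf calls from the question might look with keyspace-scoped keys. The variable names (csc, cluster, keySpace, host, port, username, password) are taken from the question, and the key format follows the pattern above; whether this resolves the authentication error for your cluster is not verified here.

// Keyspace-scoped keys use the "<cluster>:<keyspace>/<property>" prefix
csc.setConf(s"${cluster}:${keySpace}/spark.cassandra.connection.host", host)
csc.setConf(s"${cluster}:${keySpace}/spark.cassandra.connection.port", port)
csc.setConf(s"${cluster}:${keySpace}/spark.cassandra.auth.username", username)
csc.setConf(s"${cluster}:${keySpace}/spark.cassandra.auth.password", password)

// Read as before; the "cluster" and "keyspace" options determine which scoped settings apply
csc.read.format("org.apache.spark.sql.cassandra")
  .options(Map("cluster" -> cluster, "keyspace" -> keySpace, "table" -> table))
  .load()

Scoping the keys by cluster and keyspace is what lets several differently-configured cluster/keyspace combinations coexist in the same SQLContext without overwriting each other's settings.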