Is it possible to use variables in CQL commands within a CQL script?

Asked: 2015-05-21 12:31:57

Tags: cassandra cql cqlsh

When working in a CQL script, is there a way to pass a variable into a CQL command, such as:

select * from "Column Family Name" where "ColumnName"='A variable which takes different values';

Any suggestions are welcome.

2 Answers:

Answer 0 (score: 5)

No, CQL really has no way to define variables, run loops, or update/query based on those variables.

As an alternative, I typically use the DataStax Python driver for simple tasks/scripts like this. Below is an excerpt from a Python script I used a while back to populate product colors from a CSV file.

import csv

from cassandra.auth import PlainTextAuthProvider
from cassandra.cluster import Cluster

# connect to Cassandra
auth_provider = PlainTextAuthProvider(username='username', password='currentHorseBatteryStaple')
cluster = Cluster(['127.0.0.1'], auth_provider=auth_provider)
session = cluster.connect('products')

# prepare the update statement once; the '?' markers are bound on each execute
preparedUpdate = session.prepare(
    """
        UPDATE products.productsByItemID SET color=? WHERE itemid=? AND productid=?;
    """
)

counter = 0

# read the CSV file (DictReader needs a file object, not a filename)
csvfilename = 'product_colors.csv'  # placeholder path to the input CSV
with open(csvfilename) as csvfile:
    dataFile = csv.DictReader(csvfile, delimiter=',')
    for csvRow in dataFile:
        itemid = csvRow['itemid']
        color = csvRow['customcolor']
        productid = csvRow['productid']

        # update the product color by binding the prepared statement's parameters
        session.execute(preparedUpdate, [color, itemid, productid])

        counter = counter + 1

# close the Cassandra connection (shutting down the cluster also closes its sessions)
cluster.shutdown()

print("updated %d colors" % counter)

For more details, check out the DataStax tutorial Getting Started with Apache Cassandra and Python.

Answer 1 (score: 0)

Yes, you can pass a variable in the following way (here using Spark SQL with the Spark Cassandra Connector):

import com.datastax.spark.connector._
import com.datastax.spark.connector.cql.CassandraConnector
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{DataFrame, Row, SQLContext}
import org.apache.spark.sql.cassandra._

val myvar = 1

csc.setKeyspace("test_keyspace")

val query = """select a.col1, c.col4, b.col2
               from test_keyspace.table1 a
               inner join test_keyspace.table2 b on a.col1=b.col2
               inner join test_keyspace.table3 c on b.col3=c.col4
               where a.col1=""" + myvar.toString

val results = csc.sql(query)
results.show()
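Note that splicing the variable into the SQL string works cleanly here only because myvar is numeric; a string value would need quoting by hand. A sketch of an alternative under the same assumptions (the csc SQLContext and test_keyspace.table1 from above), pushing the variable into a DataFrame filter instead of the SQL text:

// a sketch, assuming the same csc context and test_keyspace.table1 as above
val df = csc.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test_keyspace", "table" -> "table1"))
  .load()

// the variable is applied as a typed filter rather than concatenated into SQL
df.filter(df("col1") === myvar).show()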