How do I form a select * from query against Salesforce using the Scala + Spark connector?

Asked: 2018-02-23 10:03:44

Tags: scala apache-spark salesforce

This works:

val soql = "select id, name, amount from opportunity"   

val sfDF  = spark.read.format("com.springml.spark.salesforce")
         .option("username", "*******")
         .option("password", "***********")
         .option("soql", soql)
         .option("version", "37.0")
         .load()

But when I form select * from opportunity instead, it does not work and throws a malformed-query exception.

I get the following exception:

18/02/23 11:56:40 WARN ForceAPIImpl: Error executing salesforce query
java.lang.Exception: Accessing https://ap5.salesforce.com/services/data/v37.0/query?q=select%20*%20from%20opportunity failed. Status 400. Reason Bad Request
Error from server [{"message":"\nselect * from opportunity\n^\nERROR at Row:1:Column:7\nunexpected token: '*'","errorCode":"MALFORMED_QUERY"}]
        at com.springml.salesforce.wave.util.HTTPHelper.execute(HTTPHelper.java:102)
        at com.springml.salesforce.wave.util.HTTPHelper.get(HTTPHelper.java:75)
        at com.springml.salesforce.wave.util.HTTPHelper.get(HTTPHelper.java:79)

3 Answers:

Answer 0 (score: 0)

SOQL does not support select *; you have to specify the fields you want to select. If you want to dynamically build the equivalent of select *, you can use the describeSObject API call to determine all the field names and build the query from them, as sketched below.
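A minimal sketch of that approach, assuming you already have an authenticated PartnerConnection named conn from the SOAP Partner API (Opportunity is just an example object name):

    // Join every field name from the describe result into an explicit select list
    val fieldNames = conn.describeSObject("Opportunity").getFields.map(_.getName)
    val soql = "select " + fieldNames.mkString(", ") + " from Opportunity"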

Answer 1 (score: 0)

val config = new ConnectorConfig()
config.setUsername(username)
config.setPassword(password)
val authEndpoint = "https://login.salesforce.com/services/Soap/u/22.0"
config.setAuthEndpoint(authEndpoint)
config.setServiceEndpoint(authEndpoint)
val conn = new PartnerConnection(config)

// (Optional) describeGlobal lists every sObject in the org
val globalResult = conn.describeGlobal()
val sobjectResults = globalResult.getSobjects()

// Describe the target sObject and collect its fields
val res = conn.describeSObject("Opportunity")
val fields = res.getFields.toBuffer
println("Has " + fields.length + " fields")

// Build a comma-separated list of field names
var strFields = ""
fields.foreach { x =>
  if (strFields.isEmpty)
    strFields = x.getName
  else
    strFields = strFields + "," + x.getName
}

val qry = "select " + strFields + " from Opportunity"
println("Query is :::::::::: " + qry)

val sfDF = spark.read.format("com.springml.spark.salesforce")
  .option("username", username)
  .option("password", password)
  .option("soql", qry)
  .load()
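A quick sanity check on the loaded DataFrame (assuming the Spark session and query above):

    // Print the inferred schema and preview a few rows
    sfDF.printSchema()
    sfDF.show(5)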

Answer 2 (score: 0)

Java implementation:

import com.sforce.soap.partner.DescribeSObjectResult;
import com.sforce.soap.partner.Field;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectorConfig;

public class SfdcGetColumns {

    public static void main(String[] args) throws Exception
    {
        // Placeholder credentials; replace with your own
        String username = "*******";
        String password = "***********";
        String endpoint = "https://login.salesforce.com/services/Soap/u/22.0";

        ConnectorConfig config = new ConnectorConfig();
        config.setUsername(username);
        config.setPassword(password);
        config.setAuthEndpoint(endpoint);
        config.setServiceEndpoint(endpoint);
        PartnerConnection conn = new PartnerConnection(config);

        // Describe the sObject named on the command line and print its fields
        DescribeSObjectResult res = conn.describeSObject(args[0]);
        Field[] fields = res.getFields();
        System.out.println("=======Has " + fields.length + " fields");
        System.out.println("==========FIELDS========");

        for(int i=0; i<fields.length; i++)
        {
            String strFieldNm = fields[i].getName();
            System.out.print(strFieldNm+",");
        }

    }

}

POM.xml:

<dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.3.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.8</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>com.springml</groupId>
            <artifactId>spark-salesforce_2.11</artifactId>
            <version>1.1.0</version>
        </dependency>
</dependencies>

You need to run this through spark-submit with --deploy-mode client.
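For example (a sketch; the jar path is a placeholder, not a value from the question):

    spark-submit \
      --deploy-mode client \
      --class SfdcGetColumns \
      target/your-app.jar Opportunity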