org.hibernate.QueryException: Space is not allowed after parameter prefix ':'

Date: 2019-01-09 18:47:14

Tags: spring hibernate jpa native-sql

I am trying to execute a query, but I get: Resolved exception caused by handler execution: org.springframework.dao.InvalidDataAccessApiUsageException: org.hibernate.QueryException: Space is not allowed after parameter prefix ':'. I followed the advice given here: How can I use MySQL assign operator(:=) in hibernate native query? and here: Hibernate exception on encountering mysql := operator, but the result is the same. Hibernate version: 5.2.17.Final.

ClientRepository.java

I have been trying different versions of Scala 2.11.x to resolve this, but none of them worked. If someone can help me understand how to resolve these kinds of issues, that would be great.

    <properties>
        <spark.version>2.3.2</spark.version>
        <aws.sdk.version>1.10.62</aws.sdk.version>
        <hadoop.version>2.7.2</hadoop.version>
    </properties>

    <pluginRepositories>
        <pluginRepository>
            <id>scala</id>
            <name>Scala Tools</name>
            <url>http://scala-tools.org/repo-releases/</url>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </pluginRepository>
    </pluginRepositories>

    <repositories>
        <repository>
            <id>scala-tools.org</id>
            <name>Scala-tools Maven2 Repository</name>
            <url>http://scala-tools.org/repo-releases</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.8</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.scalatest/scalatest_2.10 -->
        <dependency>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest_2.11</artifactId>
            <version>2.1.3</version>
        </dependency>

        <!-- SPARK -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>com.frugalmechanic</groupId>
            <artifactId>scala-optparse_2.11</artifactId>
            <version>1.1.2</version>
        </dependency>


        <!-- Spark AVRO -->
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-avro_2.11</artifactId>
            <version>3.2.0</version>
        </dependency>

        <!-- SPARK REDSHIFT -->
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-redshift_2.11</artifactId>
            <version>2.0.0</version>
        </dependency>

        <!-- SPARK CSV -->
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_2.11</artifactId>
            <version>1.0.1</version>
        </dependency>

        <!-- HADOOP -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-distcp</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
    </dependencies>



[WARNING] The POM for spark-core_2.11:jar:2.3.0 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] The POM for spark-sql_2.11:jar:2.3.0 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] The POM for spark-hive_2.11:jar:2.3.0 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ common ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ common ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-scala-plugin:2.15.2:compile (default) @ common ---
[INFO] Checking for multiple versions of scala
[WARNING] Invalid POM for spark-core_2.11:jar:2.3.0, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Invalid POM for spark-sql_2.11:jar:2.3.0, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Invalid POM for spark-hive_2.11:jar:2.3.0, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING]  Expected all dependencies to require Scala version: 2.11.8
[WARNING]  common:0.1 requires scala version: 2.11.8
[WARNING] Multiple versions of scala libraries detected!
[INFO] excludes = []
Users/amisharma/Documents/target/classes at 1547062946969
[ERROR] error: error while loading package, invalid LOC header (bad signature)
[ERROR] error: missing or invalid dependency detected while loading class file 'package.class'.
[WARNING] warning: Class org.apache.spark.annotation.InterfaceStability not found - continuing with a stub.
[ERROR] error: missing or invalid dependency detected while loading class file 'SQLContext.class'.
[INFO] Could not access term annotation in package org.apache.spark,
[INFO] because it (or its dependencies) are missing. Check your build definition for
[INFO] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[INFO] A full rebuild may help if 'SQLContext.class' was compiled against an incompatible version of org.apache.spark.
[WARNING] warning: Class org.apache.spark.annotation.InterfaceStability not found - continuing with a stub.
[ERROR] error: missing or invalid dependency detected while loading class file 'Dataset.class'.
[INFO] Could not access term annotation in package org.apache.spark,
[INFO] because it (or its dependencies) are missing. Check your build definition for
[INFO] A full rebuild may help if 'Dataset.class' was compiled against an incompatible version of org.apache.spark.
[ERROR] error: error while loading Logging, invalid LOC header (bad signature)
[WARNING] warning: Class org.apache.spark.annotation.InterfaceStability not found - continuing with a stub.
[ERROR] error: missing or invalid dependency detected while loading class file 'SQLImplicits.class'.
[INFO] Could not access term annotation in package org.apache.spark,
[INFO] because it (or its dependencies) are missing. Check your build definition for
[INFO] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[INFO] A full rebuild may help if 'SQLImplicits.class' was compiled against an incompatible version of org.apache.spark.
[WARNING] warning: Class org.apache.spark.annotation.InterfaceStability not found - continuing with a stub.
[ERROR] error: missing or invalid dependency detected while loading class file 'ColumnName.class'.
[INFO] Could not access term annotation in package org.apache.spark,
[INFO] because it (or its dependencies) are missing. Check your build definition for
[INFO] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[INFO] A full rebuild may help if 'ColumnName.class' was compiled against an incompatible version of org.apache.spark.
[ERROR] error: error while loading package, invalid LOC header (bad signature)
[ERROR] error: error while loading RDD, invalid LOC header (bad signature)
[ERROR] error: missing or invalid dependency detected while loading class file 'SQLImplicits.class'.
[INFO] Could not access type Encoder in package org.apache.spark.sql,
[INFO] because it (or its dependencies) are missing. Check your build definition for
[INFO] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[INFO] A full rebuild may help if 'SQLImplicits.class' was compiled against an incompatible version of org.apache.spark.sql.
[ERROR] error: missing or invalid dependency detected while loading class file 'LowPrioritySQLImplicits.class'.
[INFO] Could not access type Encoder in package org.apache.spark.sql,
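The repeated `invalid LOC header (bad signature)` errors typically point to corrupted jar files in the local Maven repository (for example, a truncated download), not to the Scala version itself. A common remedy, sketched here under the assumption of the default `~/.m2` repository location, is to delete the affected artifacts so Maven fetches fresh copies:

```shell
# "invalid LOC header (bad signature)" usually means a jar in the local
# Maven repository is corrupted (e.g. a truncated download).
# Deleting the affected artifacts forces Maven to fetch fresh copies.

SPARK_CACHE="$HOME/.m2/repository/org/apache/spark"
rm -rf "$SPARK_CACHE"

# Then rebuild, forcing Maven to re-check remote repositories:
#   mvn -U clean compile
echo "purged: $SPARK_CACHE"
```

Note also that the warnings mention spark-core_2.11:jar:2.3.0 while the POM pins `${spark.version}` to 2.3.2, which suggests stale artifacts from an earlier resolution are still being picked up.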


1 Answer:

Answer 0 (score: 0)

Previously, Hibernate threw an exception when the assignment operator was used in a native query. Hibernate now supports escaping the colon character so that it is not treated as a parameter prefix. Therefore, you need to escape it with a backslash: "\\:=".

Note that spaces are not allowed before or after the parameter placeholder.
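A minimal sketch of what the escape looks like at the Java string level (the actual ClientRepository query is not shown above, so the table and column names here are illustrative):

```java
// Hypothetical sketch: in Java source, "\\:" compiles to the two characters
// '\' and ':', which Hibernate interprets as an escaped, literal colon
// instead of the start of a named parameter.
public class EscapedColonDemo {
    public static void main(String[] args) {
        String sql = "SELECT @rank \\:= @rank + 1 AS rnk, c.id "
                   + "FROM client c, (SELECT @rank \\:= 0) init";
        // At runtime the string contains "\:=", so Hibernate passes a
        // literal ":=" through to MySQL rather than parsing a parameter.
        System.out.println(sql);
    }
}
```

In a Spring Data repository, the same string would typically appear inside a @Query(value = "...", nativeQuery = true) annotation; the escaping rule is identical, since the annotation value is an ordinary Java string literal.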