Error when running batch table.toDataSet.print

Time: 2021-05-02 07:04:54

Tags: apache-flink

I am using Flink 1.12 and have the following simple code: I want to register a table backed by the filesystem, convert it to a DataSet, and print it to the console. But when I run the application, it throws the following exception:

Exception in thread "main" org.apache.flink.table.api.TableException: findAndCreateTableSource failed.
    at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:49)
    at org.apache.flink.table.catalog.DatabaseCalciteSchema.convertCatalogTable(DatabaseCalciteSchema.java:164)
    at org.apache.flink.table.catalog.DatabaseCalciteSchema.convertTable(DatabaseCalciteSchema.java:107)
    at org.apache.flink.table.catalog.DatabaseCalciteSchema.lambda$getTable$0(DatabaseCalciteSchema.java:91)
    at java.util.Optional.map(Optional.java:215)
    at org.apache.flink.table.catalog.DatabaseCalciteSchema.getTable(DatabaseCalciteSchema.java:82)
    at org.apache.calcite.jdbc.SimpleCalciteSchema.getImplicitTable(SimpleCalciteSchema.java:83)
    at org.apache.calcite.jdbc.CalciteSchema.getTable(CalciteSchema.java:289)
    at org.apache.calcite.sql.validate.SqlValidatorUtil.getTableEntryFrom(SqlValidatorUtil.java:1059)
    at org.apache.calcite.sql.validate.SqlValidatorUtil.getTableEntry(SqlValidatorUtil.java:1016)
    at org.apache.calcite.prepare.CalciteCatalogReader.getTable(CalciteCatalogReader.java:119)
    at org.apache.calcite.prepare.CalciteCatalogReader.getTableForMember(CalciteCatalogReader.java:229)
    at org.apache.calcite.prepare.CalciteCatalogReader.getTableForMember(CalciteCatalogReader.java:83)
    at org.apache.calcite.tools.RelBuilder.scan(RelBuilder.java:1094)
    at org.apache.calcite.tools.RelBuilder.scan(RelBuilder.java:1123)
    at org.apache.flink.table.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:282)
    at org.apache.flink.table.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:140)
    at org.apache.flink.table.operations.CatalogQueryOperation.accept(CatalogQueryOperation.java:69)
    at org.apache.flink.table.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:137)
    at org.apache.flink.table.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:117)
    at org.apache.flink.table.operations.utils.QueryOperationDefaultVisitor.visit(QueryOperationDefaultVisitor.java:92)
    at org.apache.flink.table.operations.CatalogQueryOperation.accept(CatalogQueryOperation.java:69)
    at org.apache.flink.table.calcite.FlinkRelBuilder.tableOperation(FlinkRelBuilder.scala:121)
    at org.apache.flink.table.api.internal.BatchTableEnvImpl.translate(BatchTableEnvImpl.scala:553)
    at org.apache.flink.table.api.internal.BatchTableEnvImpl.translate(BatchTableEnvImpl.scala:537)
    at org.apache.flink.table.api.bridge.scala.internal.BatchTableEnvironmentImpl.toDataSet(BatchTableEnvironmentImpl.scala:70)
    at org.apache.flink.table.api.bridge.scala.TableConversions.toDataSet(TableConversions.scala:53)
    at org.example.dataset.DatasetTest$.main(DatasetTest.scala:31)
    at org.example.dataset.DatasetTest.main(DatasetTest.scala)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in
the classpath.

Reason: Required context properties mismatch.

The following properties are requested:
connector=filesystem
format=csv
path=D:/projects/openprojects3/learn.flink.ioc/data/stock.csv.copy
schema.0.data-type=VARCHAR(2147483647)
schema.0.name=x
schema.1.data-type=VARCHAR(2147483647)
schema.1.name=y

The following factories have been considered:
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactory
org.apache.flink.connector.jdbc.table.JdbcTableSourceSinkFactory
    at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:322)
    at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:190)
    at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:143)
    at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:96)
    at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:46)
    ... 28 more

The application code is:

import org.apache.flink.api.scala._
import org.apache.flink.table.api.FieldExpression
import org.apache.flink.table.api.bridge.scala._

object DatasetTest {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    env.setParallelism(1)

    val tenv = BatchTableEnvironment.create(env)

    val ddl2 =
      """
        |create table sourceTable(
        |  x STRING,
        |  y STRING
        |) with (
        |  'connector' = 'filesystem',
        |  'path' = 'D:/stock.csv',
        |  'format' = 'csv'
        |)
        |""".stripMargin

    tenv.executeSql(ddl2)

    // The exception above is thrown here, when the query is translated.
    // DataSet.print() triggers execution itself, so no trailing
    // env.execute() is needed (it would fail with "No new data sinks
    // have been defined").
    tenv.from("sourceTable").toDataSet[(String, String)].print()
  }
}
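For context (this is my reading of the stack trace, not something confirmed in the thread): the Scala `BatchTableEnvironment` uses the legacy planner, and the factories it lists (`CsvBatchTableSourceFactory`, `CsvAppendTableSourceFactory`, ...) match only the old-style connector property keys, not the new `'connector' = 'filesystem'` option introduced for the Blink planner. A sketch of the same DDL rewritten with the legacy keys, which the old-planner CSV factory should be able to match (path and table name taken from the question):

```scala
// Sketch, assuming the legacy planner: use the old property keys
// ('connector.type' / 'connector.path' / 'format.type') instead of
// the new 'connector' / 'path' / 'format' options.
val legacyDdl =
  """
    |create table sourceTable(
    |  x STRING,
    |  y STRING
    |) with (
    |  'connector.type' = 'filesystem',
    |  'connector.path' = 'D:/stock.csv',
    |  'format.type' = 'csv'
    |)
    |""".stripMargin

tenv.executeSql(legacyDdl)
```

The alternative would be to switch to the Blink planner's `TableEnvironment`, where `'connector' = 'filesystem'` is supported (with `flink-csv` on the classpath), but that environment has no `toDataSet` bridge, so it does not fit this batch DataSet use case directly.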

0 Answers:

No answers