I am writing a streaming service in Apache Flink. I am basically using org.apache.flink.table.sources.CsvTableSource to ingest data from a CSV file. Here is the code:
StreamTableEnvironment streamTableEnvironment = TableEnvironment
        .getTableEnvironment(streamExecutionEnvironment);
CsvTableSource csvTableSource = CsvTableSource.builder()
        .path(pathToCsvFile)
        .field("XXX0", Types.SQL_TIMESTAMP)
        .field("XXX1", Types.INT)
        .field("XXX2", Types.DECIMAL)
        .field("XXX3", Types.INT)
        .field("XXX4", Types.INT)
        .field("XXX9", Types.DECIMAL)
        .field("XXX5", Types.STRING)
        .field("XXX6", Types.STRING)
        .field("XXX7", Types.STRING)
        .fieldDelimiter(",")
        .lineDelimiter("\n")
        .ignoreFirstLine()
        .ignoreParseErrors()
        .build();
streamTableEnvironment.registerTableSource("metrics_table", csvTableSource);
Table selectedMetricTable = streamTableEnvironment.sqlQuery(getSQLQuery(metricsType, metricsGroupingLevel));
DataStream<Tuple2<Boolean, MetricsTimeSeriesData>> metricStream = streamTableEnvironment
        .toRetractStream(selectedMetricTable, MetricsTimeSeriesData.class);
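For toRetractStream(selectedMetricTable, MetricsTimeSeriesData.class) to work, MetricsTimeSeriesData must follow Flink's POJO rules (public no-arg constructor, public fields or getters/setters). The class is not shown in the question, so the following is only a hypothetical sketch; the field names and types are assumptions based on the CSV schema above:

```java
import java.math.BigDecimal;
import java.sql.Timestamp;

// Hypothetical POJO; field names and types are assumptions matching the CSV schema above.
public class MetricsTimeSeriesData {
    // Flink POJOs require public fields (or getters/setters) and a public no-arg constructor.
    public Timestamp xxx0;   // Types.SQL_TIMESTAMP maps to java.sql.Timestamp
    public int xxx1;         // Types.INT
    public BigDecimal xxx2;  // Types.DECIMAL maps to java.math.BigDecimal
    public String xxx5;      // Types.STRING

    public MetricsTimeSeriesData() {}
}
```

The field names of the POJO must line up with the column names produced by the SQL query, otherwise Flink cannot map the result rows onto the class.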
But it throws the following error:
Caused by: java.lang.ClassNotFoundException: org.apache.flink.table.sources.TableSource
Here are the Maven dependencies:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table_2.11</artifactId>
    <version>1.4.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-scala_2.11</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_2.11</artifactId>
    <version>1.4.0</version>
</dependency>
I can see the source definition of this class, yet I still get the error. Can anyone help?
Answer 0 (score: 1)
The flink-table module is not part of the Flink binary distribution, so it is not shipped to the cluster by default. You can either place that dependency in the cluster installation (in the lib folder; see the last section of setup), or you can submit the job as an uber-jar with the dependency packaged in (see here).
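The uber-jar approach can be sketched with the maven-shade-plugin. This is a minimal illustrative configuration, not taken from the original answer; the plugin version is an assumption:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.1.1</version>
            <executions>
                <execution>
                    <!-- Bundle all compile-scope dependencies into the job jar at package time -->
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

Note that the question's pom.xml declares flink-table_2.11 with <scope>provided</scope>; provided dependencies are excluded from the shaded jar, so that scope would have to be removed for the class to end up in the uber-jar.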
Answer 1 (score: 0)
I am using Flink version 1.8.0 and ran into the same problem. I was able to resolve it by adding the following dependency to pom.xml, with a system path pointing to flink-table_2.12-1.8.0.jar:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table_2.12</artifactId>
    <version>1.8.0</version>
    <scope>system</scope>
    <systemPath>E:\flink-1.8.0-scala_2.12\opt\flink-table_2.12-1.8.0.jar</systemPath>
</dependency>
Hope it helps.