I am trying to create a tumbling window of two rows in Flink (Java). It has to be based on either the dateTime column (TIMESTAMP(3) data type) or the unixDateTime column (BIGINT data type). I have added two different versions of the code; the error each one produces is shown above it.
When I print the schema of the HTable object, I see:

|-- mID: INT
|-- dateTime: TIMESTAMP(3) *ROWTIME*
|-- mValue: DOUBLE
|-- unixDateTime: BIGINT
|-- mType: STRING

StreamExecutionEnvironment fsEnv = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(fsEnv);
fsEnv.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
TupleTypeInfo<Tuple5<Integer, Timestamp, Double, Long, String>> tupleType = new TupleTypeInfo<>(
        Types.INT(),
        Types.SQL_TIMESTAMP(),
        Types.DOUBLE(),
        Types.LONG(),
        Types.STRING());

DataStream<Tuple5<Integer, Timestamp, Double, Long, String>> dsTuple =
        tableEnv.toAppendStream(HTable, tupleType);
// When I run the code below, I get this error: Caused by: java.lang.RuntimeException: Rowtime timestamp is null. Please make sure that a proper TimestampAssigner is defined and the stream environment uses the EventTime time characteristic.
Table table = tableEnv.fromDataStream(dsTuple, "mID, dateTime.rowtime, mValue, unixDateTime, mType");
DataStream<Row> stream = tableEnv.toAppendStream(table, Row.class);
stream.print();
// When I run the code below, I get this error: Exception in thread "main" java.lang.UnsupportedOperationException: Event-time grouping windows on row intervals are currently not supported.
Table table = tableEnv.fromDataStream(dsTuple, "mID, dateTime.rowtime, mValue, unixDateTime, mType")
        .window(Tumble.over("2.rows")
                .on("dateTime")
                .as("a"))
        .groupBy("a")
        .select("AVG(mValue)");
DataStream<Row> stream = tableEnv.toAppendStream(table, Row.class);
stream.print();
Answer (score: 1)
Time-based operations on streaming tables require that you explicitly tell Flink how to handle time. You will want to read the relevant section of the documentation.

In particular, pay close attention to the section on event time.
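For example, here is a minimal sketch of what that means for the code in the question, assuming unixDateTime (tuple field f3) holds epoch milliseconds: assign timestamps and watermarks to the DataStream before declaring dateTime.rowtime (the 5-second out-of-orderness bound is an arbitrary choice).

import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;

// Sketch: attach event-time timestamps and watermarks, assuming f3 (unixDateTime) is epoch milliseconds.
DataStream<Tuple5<Integer, Timestamp, Double, Long, String>> withTimestamps = dsTuple
        .assignTimestampsAndWatermarks(
                new BoundedOutOfOrdernessTimestampExtractor<Tuple5<Integer, Timestamp, Double, Long, String>>(Time.seconds(5)) {
                    @Override
                    public long extractTimestamp(Tuple5<Integer, Timestamp, Double, Long, String> element) {
                        return element.f3; // use unixDateTime as the event timestamp
                    }
                });

// With timestamps assigned, the rowtime attribute can be resolved when converting back to a Table.
Table withRowtime = tableEnv.fromDataStream(
        withTimestamps, "mID, dateTime.rowtime, mValue, unixDateTime, mType");

Note that this only addresses the first error. The second error is a separate limitation: count-based group windows such as Tumble.over("2.rows") are not supported on event time, so a two-row window would have to be defined on a processing-time attribute instead.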