```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.AllWindowedStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow;

public class CountWindowAll {
    private static final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    private static final DataStreamSource<String> stream = env.socketTextStream("192.168.8.111", 8888);

    public static void main(String[] args) throws Exception {
        // Parse each incoming line as an integer
        SingleOutputStreamOperator<Integer> mapped = stream.map((MapFunction<String, Integer>) Integer::valueOf).returns(Types.INT);
        // Non-keyed count window: fires each time 5 elements have arrived
        AllWindowedStream<Integer, GlobalWindow> countWindowAll = mapped.countWindowAll(5);
        SingleOutputStreamOperator<Integer> summed = countWindowAll.sum(0);
        summed.print();
        env.execute("CountWindowAll");
    }
}
```
`mapped.countWindowAll(5)` defines a count window of 5 elements. Because the stream is not keyed, all records flow into one global window (the windowed operator runs with parallelism 1), and the window fires each time 5 records have arrived.
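To see the firing behavior without a socket, here is a minimal local sketch (my own variation, assuming the same Flink DataStream API as above, with a bounded `fromElements` source standing in for the socket source). With ten integers and a window size of 5, two windows fire and print 15 and 40:

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CountWindowAllLocal {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Bounded source: exactly two full count windows of 5 elements each
        env.fromElements(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
           .countWindowAll(5)
           .sum(0)   // 1+2+3+4+5 = 15, then 6+7+8+9+10 = 40
           .print();
        env.execute("CountWindowAllLocal");
    }
}
```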
```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.datastream.WindowedStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow;

public class CountWindow {
    private static final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    private static final DataStreamSource<String> stream = env.socketTextStream("192.168.8.111", 8888);

    public static void main(String[] args) throws Exception {
        // Parse lines of the form "<word> <count>" into (word, count) tuples
        SingleOutputStreamOperator<Tuple2<String, Integer>> mapped = stream.map((MapFunction<String, Tuple2<String, Integer>>) item -> {
            String[] data = item.split(" ");
            return Tuple2.of(data[0], Integer.valueOf(data[1]));
        }).returns(Types.TUPLE(Types.STRING, Types.INT));
        // Key by the word, then open a count window of 5 elements per key
        KeyedStream<Tuple2<String, Integer>, Tuple> keyed = mapped.keyBy(0);
        WindowedStream<Tuple2<String, Integer>, Tuple, GlobalWindow> countWindow = keyed.countWindow(5);
        SingleOutputStreamOperator<Tuple2<String, Integer>> summed = countWindow.sum(1);
        summed.print();
        env.execute("CountWindow");
    }
}
```
Once the stream is keyed, each key maintains its own count window: `countWindow(5)` fires for a given key only after that key has accumulated 5 elements, independently of the other keys, as the sketch below illustrates.
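A minimal local sketch of the per-key behavior (again my own variation, assuming the same API, with a bounded `fromElements` source instead of the socket): key `"a"` receives five elements so its window fires, while key `"b"` only receives three and produces no output.

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CountWindowLocal {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.fromElements(
                Tuple2.of("a", 1), Tuple2.of("b", 1), Tuple2.of("a", 1), Tuple2.of("a", 1),
                Tuple2.of("b", 1), Tuple2.of("a", 1), Tuple2.of("b", 1), Tuple2.of("a", 1))
           .keyBy(0)
           .countWindow(5)
           .sum(1)   // only key "a" reaches 5 elements, so only (a,5) is printed
           .print();
        env.execute("CountWindowLocal");
    }
}
```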