PyFlink 1.15 Documentation

from pyflink.common import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

# create a StreamExecutionEnvironment, which is the entry point of a `DataStream` program.
env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

ds = env.from_collection(
    [(1, 'Hi'), (2, 'Hello')],
    type_info=Types.ROW_NAMED(["id", "data"], …

… you can refer to the latest version of the PyFlink DataStream API doc.

StreamExecutionEnvironment Creation
StreamExecutionEnvironment is the entry point and central concept for creating DataStream API programs.

36 pages | 266.77 KB | 1 year ago

PyFlink 1.16 Documentation

from pyflink.common import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

# create a StreamExecutionEnvironment, which is the entry point of a `DataStream` program.
env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

ds = env.from_collection(
    [(1, 'Hi'), (2, 'Hello')],
    type_info=Types.ROW_NAMED(["id", "data"], …

… you can refer to the latest version of the PyFlink DataStream API doc.

StreamExecutionEnvironment Creation
StreamExecutionEnvironment is the entry point and central concept for creating DataStream API programs.

36 pages | 266.80 KB | 1 year ago
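
For readers coming from the Scala DataStream API, the following is a minimal counterpart of the PyFlink excerpts above. It is an editor-added sketch, not part of either PyFlink document; the object name, element values, and job name are illustrative.

import org.apache.flink.streaming.api.scala._

object EnvCreationExample {
  def main(args: Array[String]): Unit = {
    // the StreamExecutionEnvironment is the entry point of a DataStream program
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // build a stream from an in-memory collection (illustrative values)
    val ds: DataStream[(Int, String)] = env.fromElements((1, "Hi"), (2, "Hello"))
    ds.print()
    // programs are built lazily; nothing runs until execute() is called
    env.execute("env-creation-example")
  }
}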

Introduction to Apache Flink and Apache Kafka - CS 591 K1: Data Stream Processing and Analytics Spring 2020

… Double)

object MaxSensorReadings {
  def main(args: Array[String]) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val sensorData = env.addSource(new SensorSource)
    val …

26 pages | 3.33 MB | 1 year ago
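
The matched snippet breaks off right after the source is created. Below is one plausible continuation, sketched by the editor rather than taken from the slides: it assumes the course's SensorReading case class (only "temp: Double" appears in the excerpts further down; "id" and "timestamp" are assumptions) and an arbitrary 1-second window.

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

// assumed record type; only parts of it are visible in the excerpts
case class SensorReading(id: String, timestamp: Long, temp: Double)

object MaxSensorReadings {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // SensorSource is the course's demo source, assumed to emit SensorReading records
    val sensorData: DataStream[SensorReading] = env.addSource(new SensorSource)

    // continuation sketch: per-sensor maximum temperature over tumbling 1-second windows
    val maxTemp = sensorData
      .map(r => (r.id, r.temp))
      .keyBy(_._1)
      .timeWindow(Time.seconds(1))
      .max(1)

    maxTemp.print()
    env.execute("max sensor temperatures")
  }
}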

State management - CS 591 K1: Data Stream Processing and Analytics Spring 2020

state.checkpoints.dir: path/to/checkpoint/folder/

In your Flink program:

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  val checkpointPath: String = ???  // configure path for checkpoints

Keyed state scope …

  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.setStreamTimeCharacteristic(TimeCharacteristic…

24 pages | 914.13 KB | 1 year ago
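
To connect the two fragments above, here is an editor-added sketch (not the slides' code) of wiring a checkpoint interval and a filesystem state backend into the program; the 10-second interval and the file:// URI are placeholders.

import org.apache.flink.runtime.state.filesystem.FsStateBackend
import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment

// snapshot all operator state periodically (interval is a placeholder)
env.enableCheckpointing(10000L)

// keep checkpoints on a durable filesystem; the URI is a placeholder
val checkpointPath: String = "file:///path/to/checkpoint/folder/"
env.setStateBackend(new FsStateBackend(checkpointPath))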

Windows and triggers - CS 591 K1: Data Stream Processing and Analytics Spring 2020

object MaxSensorReadings {
  def main(args: Array[String]) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val sensorData = env.addSource(new SensorSource)
    …

… define time when you are creating windows. The time characteristic is a property of the StreamExecutionEnvironment.

Configuring a time characteristic:

object AverageSensorReadings {
  def main(args: Array[String]) {
    // set up the streaming execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // use event time for the application
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
    …

35 pages | 444.84 KB | 1 year ago
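
As an illustration of how the time characteristic and a window assigner fit together, the following editor-added sketch (not from the slides) computes a per-sensor maximum over tumbling 1-minute event-time windows; SensorReading and SensorSource are the course's types as assumed above, and the window size is arbitrary.

import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time

val env = StreamExecutionEnvironment.getExecutionEnvironment
// event time is the notion of time the window assigner below will use
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

// SensorSource / SensorReading as in the course excerpts (assumed)
val sensorData: DataStream[SensorReading] = env.addSource(new SensorSource)

// per-sensor maximum reading over tumbling 1-minute event-time windows;
// event-time windows also require timestamps and watermarks (see the next entry)
val maxPerWindow = sensorData
  .keyBy(_.id)
  .window(TumblingEventTimeWindows.of(Time.minutes(1)))
  .reduce((r1, r2) => if (r1.temp > r2.temp) r1 else r2)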

Notions of time and progress - CS 591 K1: Data Stream Processing and Analytics Spring 2020

… if the stream contains special records that encode watermark information.

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

  …
    // record timestamp
    r.timestamp
  } }

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  // set the event time characteristic
  env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

22 pages | 2.22 MB | 1 year ago
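
An editor-added sketch of how record timestamps and watermarks are typically assigned with the pre-1.11 API these slides use; the 5-second out-of-orderness bound is arbitrary, and SensorReading/SensorSource are the course's types as assumed above.

import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

val env = StreamExecutionEnvironment.getExecutionEnvironment
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

val sensorData: DataStream[SensorReading] = env.addSource(new SensorSource)

// use the record timestamp as event time and emit watermarks that trail the
// largest timestamp seen so far by at most 5 seconds
val withTimestamps = sensorData.assignTimestampsAndWatermarks(
  new BoundedOutOfOrdernessTimestampExtractor[SensorReading](Time.seconds(5)) {
    override def extractTimestamp(r: SensorReading): Long = r.timestamp
  })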

Streaming in Apache Flink

• Event time
• Ingestion time
• Processing time (default)

  final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.setStreamTimeCharacteristic(TimeCharacteristic…

45 pages | 3.00 MB | 1 year ago

Fault-tolerance demo & reconfiguration - CS 591 K1: Data Stream Processing and Analytics Spring 2020

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  // set the maximum parallelism for this application
  env.setMaxParallelism(512)

41 pages | 4.09 MB | 1 year ago
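
For context, a brief editor-added sketch of how the maximum parallelism relates to the parallelism the job actually runs with; 512 matches the excerpt, the other values are arbitrary.

import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment

// upper bound for later rescaling: keyed state is hashed into this many key groups,
// so the job cannot be scaled beyond 512 parallel instances
env.setMaxParallelism(512)

// the parallelism the job runs with right now (must not exceed the max parallelism)
env.setParallelism(4)

// both settings can also be overridden per operator, e.g.
//   stream.map(...).setParallelism(2).setMaxParallelism(128)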

Stream processing fundamentals - CS 591 K1: Data Stream Processing and Analytics Spring 2020

… temp: Double)

object MaxSensorReadings {
  def main(args: Array[String]) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val sensorData = env.addSource(new SensorSource)
    val …

45 pages | 1.22 MB | 1 year ago

Streaming optimizations - CS 591 K1: Data Stream Processing and Analytics Spring 2020

Task chaining: Fusion in Flink

  StreamExecutionEnvironment.disableOperatorChaining()

  val input: DataStream[X] = ...
  val result: DataStream[Y] = …

54 pages | 2.83 MB | 1 year ago
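
An editor-added sketch of the chaining controls the slide refers to: chaining (fusion) can be disabled job-wide on the StreamExecutionEnvironment or broken at individual operators. The source and transformations are placeholders.

import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment
// option 1: turn operator chaining off for the whole job
env.disableOperatorChaining()

// placeholder source and transformations
val input: DataStream[String] = env.socketTextStream("localhost", 9999)

val result: DataStream[String] = input
  .map(_.toUpperCase)
  .startNewChain()      // option 2: begin a new chain at this operator
  .filter(_.nonEmpty)
  .disableChaining()    // option 3: never chain this operator with its neighbours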

11 results in total