Apache Flink

Since Camel 2.18

This page documents the Apache Flink component for Apache Camel. The camel-flink component provides a bridge between Camel connectors and Flink tasks.

This Camel Flink connector provides a way to route messages from various transports, dynamically choose a Flink task to execute, use the incoming message as input data for the task, and finally deliver the results back to the Camel pipeline.
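
For instance, a route can take messages from any Camel endpoint and submit their bodies to a Flink job. The sketch below is illustrative only: the direct:analyze endpoint and the myDataSet and myDataSetCallback bean names are assumptions, and the referenced beans must be bound in the Camel registry (see the callback examples further down).

import org.apache.camel.builder.RouteBuilder;

public class FlinkDataSetRoute extends RouteBuilder {
    @Override
    public void configure() {
        // The incoming message body is handed to the Flink task as input;
        // the callback's return value becomes the outgoing message body.
        from("direct:analyze")
            .to("flink:dataset?dataSet=#myDataSet&dataSetCallback=#myDataSetCallback");
    }
}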

Maven users will need to add the following dependency to their pom.xml for this component:

<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-flink</artifactId>
    <version>x.x.x</version>
    <!-- use the same version as your Camel core version -->
</dependency>

URI Format

Currently, the Flink component supports only producer endpoints. Both DataSet and DataStream jobs can be created.

flink:dataset?dataset=#myDataSet&dataSetCallback=#dataSetCallback
flink:datastream?datastream=#myDataStream&dataStreamCallback=#dataStreamCallback
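
The #-prefixed values (for example #myDataSet and #dataSetCallback) are looked up by name in the Camel registry. In a Spring application they can simply be declared as beans; a minimal sketch, assuming the input file data/input.txt and the bean name myDataSet:

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MyFlinkBeans {

    @Bean
    public DataSet<String> myDataSet() {
        // The dataSet=#myDataSet URI option resolves to this bean.
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        return env.readTextFile("data/input.txt");
    }
}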

The Apache Flink endpoint is configured using URI syntax:

flink:endpointType

with the following path and query parameters:

Path Parameters (1 parameter):

Name | Description | Default | Type
endpointType | Required. Type of the endpoint (dataset, datastream). | | EndpointType

Query Parameters (6 parameters):

Name | Description | Default | Type
collect (producer) | Indicates if results should be collected or counted. | true | boolean
dataSet (producer) | DataSet to compute against. | | DataSet
dataSetCallback (producer) | Function performing action against a DataSet. | | DataSetCallback
dataStream (producer) | DataStream to compute against. | | DataStream
dataStreamCallback (producer) | Function performing action against a DataStream. | | DataStreamCallback
synchronous (advanced) | Sets whether synchronous processing should be strictly used, or Camel is allowed to use asynchronous processing (if supported). | false | boolean

Spring Boot Auto-Configuration

When using Spring Boot, make sure to use the following Maven dependency to have support for auto-configuration:

<dependency>
  <groupId>org.apache.camel</groupId>
  <artifactId>camel-flink-starter</artifactId>
  <version>x.x.x</version>
  <!-- use the same version as your Camel core version -->
</dependency>

The component supports 6 options, which are listed below.

Name | Description | Default | Type
camel.component.flink.data-set | DataSet to compute against. The option is a org.apache.flink.api.java.DataSet type. | | String
camel.component.flink.data-set-callback | Function performing action against a DataSet. The option is a org.apache.camel.component.flink.DataSetCallback type. | | String
camel.component.flink.data-stream | DataStream to compute against. The option is a org.apache.flink.streaming.api.datastream.DataStream type. | | String
camel.component.flink.data-stream-callback | Function performing action against a DataStream. The option is a org.apache.camel.component.flink.DataStreamCallback type. | | String
camel.component.flink.enabled | Enable the flink component. | true | Boolean
camel.component.flink.resolve-property-placeholders | Whether the component should resolve property placeholders on itself when starting. Only properties which are of String type can use property placeholders. | true | Boolean
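
With the starter on the classpath, these options can be set in application.properties. A short sketch; the myDataSet and dataSetCallback bean names are assumptions and must exist in the registry:

camel.component.flink.enabled=true
# Object-typed options refer to beans in the registry by name
camel.component.flink.data-set=#myDataSet
camel.component.flink.data-set-callback=#dataSetCallback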

FlinkComponent Options

The Apache Flink component supports 5 options, which are listed below.

Name | Description | Default | Type
dataSet (producer) | DataSet to compute against. | | DataSet
dataStream (producer) | DataStream to compute against. | | DataStream
dataSetCallback (producer) | Function performing action against a DataSet. | | DataSetCallback
dataStreamCallback (producer) | Function performing action against a DataStream. | | DataStreamCallback
resolvePropertyPlaceholders (advanced) | Whether the component should resolve property placeholders on itself when starting. Only properties which are of String type can use property placeholders. | true | boolean
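
These component-level options act as defaults inherited by every flink endpoint and can also be applied programmatically. A minimal sketch, assuming a CamelContext is already available and dataSetCallback is a bean like the one defined below (FlinkComponent is org.apache.camel.component.flink.FlinkComponent):

FlinkComponent flink = camelContext.getComponent("flink", FlinkComponent.class);
// Component-level defaults apply to every flink endpoint created afterwards
flink.setDataSetCallback(dataSetCallback);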

Flink DataSet Callback

@Bean
public DataSetCallback<Long> dataSetCallback() {
    return new DataSetCallback<Long>() {
        public Long onDataSet(DataSet dataSet, Object... objects) {
            try {
                // Print the DataSet and report success (0) or failure (-1).
                dataSet.print();
                return 0L;
            } catch (Exception e) {
                return -1L;
            }
        }
    };
}

Flink DataStream Callback

@Bean
public VoidDataStreamCallback dataStreamCallback() {
    return new VoidDataStreamCallback() {
        @Override
        public void doOnDataStream(DataStream dataStream, Object... objects) throws Exception {
            // Splitter is a user-provided FlatMapFunction that tokenizes each element.
            dataStream.flatMap(new Splitter()).print();

            // Run the streaming job on the environment that created the DataStream.
            dataStream.getExecutionEnvironment().execute("data stream test");
        }
    };
}

Camel-Flink Producer call

// context is the Spring ApplicationContext holding the beans defined above
CamelContext camelContext = new SpringCamelContext(context);

String pattern = "foo";

try {
    ProducerTemplate template = camelContext.createProducerTemplate();
    camelContext.start();
    // The message body ("foo") is sent to the dataset endpoint and passed on to the callback.
    Long count = template.requestBody("flink:dataSet?dataSet=#myDataSet&dataSetCallback=#countLinesContaining", pattern, Long.class);
} finally {
    camelContext.stop();
}
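
The #countLinesContaining callback referenced above is not part of the component; it is a user-supplied bean. A hypothetical implementation, assuming the DataSet contains lines of text and that the message body ("foo") is handed to the callback as the first vararg (FilterFunction is org.apache.flink.api.common.functions.FilterFunction):

@Bean
public DataSetCallback<Long> countLinesContaining() {
    return new DataSetCallback<Long>() {
        public Long onDataSet(DataSet dataSet, Object... objects) {
            final String pattern = String.valueOf(objects[0]);
            try {
                // Count the lines of the DataSet that contain the pattern.
                return ((DataSet<String>) dataSet)
                        .filter(new FilterFunction<String>() {
                            public boolean filter(String line) {
                                return line.contains(pattern);
                            }
                        })
                        .count();
            } catch (Exception e) {
                return -1L;
            }
        }
    };
}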