Advantco KAFKA adapter for SAP HANA Smart Data Integration (SDI)

Introduction

SAP HANA smart data integration (SDI) and SAP HANA smart data quality load data, in batch or real time, into SAP HANA (on premise or in the cloud) from a variety of sources using pre-built and custom adapters. The Kafka adapter is one such custom adapter, made available by Advantco.

Apache Kafka is an open-source distributed event streaming platform for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.

In this blog, we discuss a solution to consume data from a Kafka topic and to publish data to a Kafka topic.

Kafka topic setup

We create a topic with two partitions on the Kafka broker and publish a simplified DEBMAS (customer master) message to it.

An Avro schema defining the structure of the data is associated with the topic value. This schema determines the table schema in HANA.
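
As an illustration, a simplified Avro schema for the topic value might look like the following. The record and field names are a hypothetical subset of the DEBMAS customer master (customer number, name, city, country); the actual schema depends on your data:

```json
{
  "type": "record",
  "name": "DebmasValue",
  "namespace": "com.example.kafka",
  "fields": [
    {"name": "KUNNR", "type": "string"},
    {"name": "NAME1", "type": ["null", "string"], "default": null},
    {"name": "ORT01", "type": ["null", "string"], "default": null},
    {"name": "LAND1", "type": ["null", "string"], "default": null}
  ]
}
```

Each Avro field becomes a column of the corresponding table in HANA.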

Kafka adapter installation

The Kafka adapter is deployed in the hosting environment where the Data Provisioning Agent is installed. After the deployment, we register the Kafka adapter with the HANA server, which can be on premise or in the cloud. Note that each agent has an Agent Name.
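
The registration can also be done with SDI's standard SQL statements. A minimal sketch, assuming the agent is named AGENT_01, listens on the placeholder host and port shown, and the adapter registers itself as "KafkaAdapter" (all three names are placeholders for your landscape):

```sql
-- Register the agent host with the HANA server (host and port are placeholders)
CREATE AGENT "AGENT_01" PROTOCOL 'TCP' HOST 'agent.example.com' PORT 5050;

-- Register the Kafka adapter deployed on that agent
CREATE ADAPTER "KafkaAdapter" AT LOCATION AGENT "AGENT_01";
```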

HANA Development Workbench

In this blog, we use a HANA Cloud instance. We use the HANA Workbench to define the connection to the Kafka broker by creating a Remote Source. We will also use the Workbench to create a replication task that consumes data from a Kafka topic into a real HANA table.

Kafka Remote Source

The Remote Source uses the registered Kafka adapter that we deployed on the Data Provisioning Agent to connect to the Kafka broker. To create the Remote Source, select the Kafka adapter and the Agent Name, specify the Kafka broker hosts and ports, configure the remaining parameters, and run a connection test.
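
The same Remote Source can be created in SQL with SDI's CREATE REMOTE SOURCE statement. This is a sketch only: the remote source, adapter, and agent names are placeholders, and the configuration and credential property names inside the XML are illustrative — the actual keys are defined by the Advantco adapter:

```sql
-- Create a Remote Source over the Kafka adapter registered on the agent
CREATE REMOTE SOURCE "KAFKA_RS" ADAPTER "KafkaAdapter" AT LOCATION AGENT "AGENT_01"
  CONFIGURATION '<?xml version="1.0" encoding="UTF-8"?>
    <ConnectionProperties name="connection">
      <!-- Kafka broker host:port list (placeholder values) -->
      <PropertyEntry name="brokers">broker1.example.com:9092,broker2.example.com:9092</PropertyEntry>
    </ConnectionProperties>'
  WITH CREDENTIAL TYPE 'PASSWORD'
  USING '<CredentialEntry name="credential">
      <user>kafka_user</user>
      <password>secret</password>
    </CredentialEntry>';
```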

Virtual Table

Once the Remote Source is created, you should see the topics defined on the Kafka broker. You can create a Virtual Table by right-clicking on a topic.
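
Virtual Tables can also be created in SQL. A sketch, assuming a Remote Source named KAFKA_RS and a topic named DEBMAS_TOPIC (both placeholders); the "&lt;NULL&gt;" components stand for the remote database and owner, which do not apply to a Kafka source:

```sql
-- Create a Virtual Table over the topic exposed by the Remote Source
CREATE VIRTUAL TABLE "MYSCHEMA"."VT_DEBMAS"
  AT "KAFKA_RS"."<NULL>"."<NULL>"."DEBMAS_TOPIC";
```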

The Virtual Table schema is based on the Avro schema defined for the topic on the Kafka broker side.

The Virtual Table is a direct connection to a topic: when querying the Virtual Table, you act as a Kafka consumer, and when inserting into it, you act as a producer.
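
In SQL terms, assuming a Virtual Table VT_DEBMAS with the hypothetical customer-master columns used earlier (all names are placeholders):

```sql
-- Reading from the Virtual Table consumes messages from the topic
SELECT * FROM "MYSCHEMA"."VT_DEBMAS";

-- Inserting into the Virtual Table publishes a message to the topic
INSERT INTO "MYSCHEMA"."VT_DEBMAS" ("KUNNR", "NAME1", "ORT01", "LAND1")
VALUES ('0000100001', 'ACME Corp', 'Walldorf', 'DE');
```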

Replication Task

In this step, we create a task to replicate data from the Kafka topic into a real table in HANA. This is done in the Editor section of the HANA Workbench.

We select the Remote Source defined above, specify the Target Schema, and click Add Objects to choose one or more topics to replicate.

Once the topic is added, you can edit the column properties if needed.

Execute the task to start the replication. You can monitor the progress with the Data Provisioning Design Time Object Monitor.
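
Alongside the monitor, the remote subscription that the replication task creates can be checked from SQL via HANA's M_REMOTE_SUBSCRIPTIONS system view (the Remote Source name KAFKA_RS is a placeholder):

```sql
-- Check the state of the remote subscription behind the replication task
SELECT SUBSCRIPTION_NAME, STATE
  FROM "SYS"."M_REMOTE_SUBSCRIPTIONS"
 WHERE REMOTE_SOURCE_NAME = 'KAFKA_RS';
```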

On the Kafka broker side, you can see that the messages are being consumed.

Conclusion

The Kafka adapter for SAP HANA SDI (Smart Data Integration) provides a convenient way to replicate data from a Kafka topic into a HANA table. The adapter also supports publishing data from HANA to a Kafka broker.

Sources:

https://kafka.apache.org/

https://help.sap.com/viewer/product/HANA_SMART_DATA_INTEGRATION/2.0_SPS05/en-US?task=discover_task

https://advantco.com/


Please reach out to our sales team at sales@advantco.com if you have any questions.