SAP HANA smart data integration and SAP HANA smart data quality load data, in batch or in real time, into SAP HANA (on premise or in the cloud) from a variety of sources using pre-built and custom adapters. The Kafka adapter is an example of a custom adapter made available by Advantco. Apache Kafka is an open-source distributed event streaming platform for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. In this blog, we discuss a solution to consume data from a Kafka topic and to publish data to a Kafka topic.
February 17, 2021
February 08, 2021
SAP has released new OEM adapters for Salesforce, Dynamics 365 CRM, and Amazon Web Services. Migrating from the Advantco OEM adapters to the new OEM adapters is not feasible, however, because the new adapters lack many essential features.
June 02, 2020
A typical use case is where an external application uploads a file to an S3 bucket. Once the upload is complete, Amazon S3 generates an event notification, which can be forwarded to an Amazon SQS queue, an Amazon SNS topic, or another destination. In this blog, we describe the steps required to handle the S3–SQS event notification. Once the event is published to the SQS queue, SAP PO uses the Advantco AWS adapter to pull the event message from SQS. The event message contains all the information required to query the S3 file. We use the Async–Sync bridge feature in this scenario to return the file for downstream processing.
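The notification that lands in the SQS queue is a JSON document that identifies the bucket and object key of the uploaded file. A minimal sketch of pulling those fields out of the message body (the helper name and sample bucket/key values are ours; the structure follows the documented S3 `ObjectCreated` event format):

```python
import json

def parse_s3_event(sqs_message_body):
    """Extract (bucket, key) pairs from an S3 event notification
    delivered via SQS. The body follows AWS's documented
    s3:ObjectCreated event structure."""
    event = json.loads(sqs_message_body)
    return [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]

# Trimmed-down example of the notification S3 publishes on upload
body = json.dumps({
    "Records": [{
        "eventName": "ObjectCreated:Put",
        "s3": {
            "bucket": {"name": "inbound-files"},
            "object": {"key": "orders/2020-05-01.csv"}
        }
    }]
})
print(parse_s3_event(body))  # [('inbound-files', 'orders/2020-05-01.csv')]
```

With the bucket and key in hand, the Async–Sync bridge can issue the follow-up request that retrieves the file itself from S3.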
May 01, 2020
There are many use cases where one wants data encrypted before it is written to a file or sent to an external application or integration platform. This blog describes one solution: using the Advantco PGP Webservice to encrypt and decrypt data in SAP ECC or SAP S/4HANA.
March 30, 2020
There are many databases on the market, but most of them fall into three data models: relational, document, and graph. The data model and the way the application stores and retrieves data are the factors that drive database choice; we do not argue that one model is better than another. The document model is clearly the best fit for use cases where the data consists of self-contained documents with no many-to-one or many-to-many relationships between them. The graph model is the opposite: it targets use cases where the relationships in the data are more complicated.
March 10, 2020
Data integration with Salesforce is not simple. One has to understand API objects, which are analogous to database tables; SOQL, the query language used to extract data from Salesforce; the different API sets and when each is appropriate; the concept of the External Id; and the limits on API calls.
February 26, 2020
Confluent Schema Registry stores Avro schemas for Kafka producers and consumers, so that producers can write data with a schema that consumers can still read even as both sides evolve the schemas they use.
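As an illustration of a backward-compatible evolution, suppose the original record had only `id` and `amount` fields. A producer can later register the evolved schema below; because the new field carries a default, consumers can still read records written with the older schema (record and field names here are hypothetical):

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string", "default": "USD"}
  ]
}
```

The Schema Registry's compatibility check enforces exactly this kind of rule before accepting a new schema version for a subject.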
February 17, 2020
Neo4j is a graph database designed to treat the relationships between data as equally important as the data itself. Because Neo4j supports relationships natively, it can store, process, and query connections efficiently. While other databases compute relationships at query time through expensive JOIN operations, a graph database stores connections alongside the data in the model.
December 16, 2019
Apache Pulsar is a distributed, open-source publish/subscribe messaging system originally developed at Yahoo. Pulsar is considered an alternative to Kafka because of the key features below. Advantco has released an adapter to enable out-of-the-box integration with a Pulsar broker.
October 22, 2019
Advantco is excited to announce that an international non-profit organization has chosen our Salesforce Adapter to meet their technology integration requirements.
October 03, 2019
When consuming gRPC services, SAP PO acts as a gRPC client application calling methods on a server application running on a different machine. The server application can be an internally developed microservice or a cloud service such as Google Pub/Sub.
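The contract between client and server is a protobuf service definition; a minimal, purely illustrative example of the kind of interface SAP PO would call as a client (service and message names are hypothetical):

```protobuf
syntax = "proto3";

package demo;

// Hypothetical microservice contract; SAP PO acts as the client stub
// and invokes GetOrder over HTTP/2.
service OrderService {
  rpc GetOrder (OrderRequest) returns (OrderReply);
}

message OrderRequest {
  string order_id = 1;
}

message OrderReply {
  string order_id = 1;
  double amount = 2;
}
```

From a definition like this, the protobuf compiler generates the client stubs the adapter uses to serialize requests and deserialize replies.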
September 24, 2019
There are use cases where one has to publish huge volumes of data from SAP HANA to Salesforce. This blog describes the options available with the Advantco Salesforce adapter for SDI. We describe how to use virtual tables to upsert (update/create) standard or custom objects in Salesforce, and we also handle platform events for more complex use cases where one has parent and child objects.
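With SDI, a remote Salesforce object is exposed inside HANA as a virtual table, so publishing data becomes plain SQL against that table. A rough sketch, assuming a remote source named SF_SOURCE, a schema MYSCHEMA, and a HANA staging table CUSTOMERS (all names illustrative; the exact remote-object path and writeback behavior depend on the adapter configuration):

```sql
-- Expose the Salesforce Account object as a virtual table.
CREATE VIRTUAL TABLE "MYSCHEMA"."VT_SF_ACCOUNT"
  AT "SF_SOURCE"."<NULL>"."<NULL>"."Account";

-- Push rows from a HANA table; the adapter maps the write to a
-- Salesforce upsert keyed on the External Id field.
INSERT INTO "MYSCHEMA"."VT_SF_ACCOUNT" ("External_Id__c", "Name")
  SELECT "EXT_ID", "NAME" FROM "MYSCHEMA"."CUSTOMERS";
```

The same virtual-table approach works for standard and custom objects; platform events, as noted above, cover the more complex parent/child cases.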