
Application Integration: Integration over Kafka

Hello! 

Did you know that you can use Application Integration to easily manage and propagate your Kafka events to the cloud?

Kafka is a distributed event store and stream-processing platform. It lets you, for example, produce events in real time from your digital platforms, which is much harder to achieve with a point-to-point approach.

Your IT systems can then consume the events from the topic according to their needs (for example, being notified of a user action or a bug in the platform).

The only issue is that integrating IT systems with Kafka can be complex, because most of those systems were never designed to subscribe to a topic.

This is where Application Integration comes into play: a simple, managed way to modernize your digital experience using Kafka. Application Integration subscribes to the topic and propagates the right information in real time to IT systems using APIs or connectors.

It also provides a mediation layer between Kafka and the IT systems.


I built a few demos with different IT systems (email, REST APIs, ServiceNow, Salesforce) that work pretty well.


Here is a quick guide to get started on your own:

  1. Install Kafka (locally or in the cloud)

Follow the Kafka quickstart to download Kafka, start the Kafka environment, and create a Kafka topic.
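
If you want to sanity-check the broker from code, here is a minimal sketch using the kafka-python client (my assumption; the quickstart's console producer and consumer scripts work just as well). The broker address and topic name are placeholders:

```
# pip install kafka-python
from kafka import KafkaProducer, KafkaConsumer

BOOTSTRAP = "localhost:9092"  # assumed local quickstart broker
TOPIC = "kafka-events"        # assumed topic name created in the quickstart

# Produce a test message...
producer = KafkaProducer(bootstrap_servers=BOOTSTRAP)
producer.send(TOPIC, b'{"Id":"0","type":"test","message":"hello"}')
producer.flush()

# ...and read it back from the beginning of the topic
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BOOTSTRAP,
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating after 5 s without messages
)
for record in consumer:
    print(record.value)
```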

  2. Create a Google Pub/Sub topic

As Application Integration does not (yet) provide a Kafka trigger, you will need GCP Pub/Sub to receive the events from Kafka and trigger the integration.

In the GCP Pub/Sub Service, create a topic with a default subscription.
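
You can do this in the console, or from code; here is a minimal sketch with the google-cloud-pubsub client (the project, topic, and subscription names are placeholders of mine):

```
# pip install google-cloud-pubsub
from google.cloud import pubsub_v1

PROJECT_ID = "your-project-id"        # assumed GCP project
TOPIC_ID = "kafka-events"             # assumed Pub/Sub topic name
SUBSCRIPTION_ID = "kafka-events-sub"  # assumed subscription name

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

# Create the topic, then a pull subscription on it (the equivalent of the
# console's "default subscription")
publisher.create_topic(request={"name": topic_path})
subscriber.create_subscription(
    request={"name": subscription_path, "topic": topic_path}
)
```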

 

  3. Deploy the Pub/Sub connector for Kafka

 

Use the Java Pub/Sub Kafka Connector and add it to your Kafka Connect setup. For this demo you only need to configure the CloudPubSubSinkConnector.
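
A minimal sink configuration looks roughly like this (a sketch based on the connector's documented properties; the topic and project names are the same placeholders as above):

```
name=CPSSinkConnector
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=1
# Kafka topic to read from (assumed name from step 1)
topics=kafka-events
# Destination GCP project and Pub/Sub topic (assumed names from step 2)
cps.project=your-project-id
cps.topic=kafka-events
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
```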

Launch the four processes described in the connector documentation; you can then publish a message on the Kafka topic and it will automatically be published to Google Pub/Sub.

  4. You did the most difficult part already! Now you just need to create the Integration flow as you like, taking the event topic as input and using APIs or connectors as output to propagate the messages. In my demo the flow is built from the following tasks (a conceptual sketch of the routing follows the list):

 


  1. Create the Pub/Sub trigger with the topic name
  2. Create a Data Mapping task to extract the "type" variable from the Kafka message (JSON)
  3. Create an email task that sends the message content to your email
  4. Create a REST task to send the message through an API (for example, to an Apigee endpoint)
  5. Use a ServiceNow connector instance to create a new Incident
  6. Use a Salesforce connector instance to publish a new event to the Platform Event you defined
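
For clarity, here is what the flow does conceptually, sketched in Python. Application Integration implements this with no-code tasks, so this is only an illustration, and the handler functions are hypothetical placeholders:

```
import json

def handle_event(raw_message: bytes) -> None:
    """Route a Kafka/Pub/Sub message the way the integration flow does."""
    event = json.loads(raw_message)  # Data Mapping task: parse the JSON...
    event_type = event["type"]       # ...and extract the "type" variable

    if event_type == "email":
        send_email(event["user"], event["message"])  # email task
    elif event_type == "api":
        call_rest_endpoint(event)                    # REST task (e.g. Apigee)
    elif event_type == "incident":
        create_servicenow_incident(event)            # ServiceNow connector
    elif event_type == "salesforce":
        publish_platform_event(event)                # Salesforce connector
```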

Here are a few sample Kafka messages to test the flow on your end:

Mail over Kafka : {"Id":"1234","user":"user@domain.com","type":"email","message":"Customer asked to be called by sales"}

APIs over Kafka : {"Id":"12345","user":"user@domain.com","type":"api","message":"Profile changed to developer"}

Incidents over Kafka : {"Id":"123456","user":"user@domain.com","type":"incident","message":"Bug report on UI 504 error code"}

Salesforce over Kafka : {"Id":"1234567","user":"user@domain.com","type":"salesforce","message":"New eval created on Apigee"}
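
To push these samples into Kafka, you can use the quickstart's console producer or, if you prefer code, a short loop like this (again assuming the kafka-python client and the placeholder broker and topic from step 1):

```
from kafka import KafkaProducer

BOOTSTRAP = "localhost:9092"  # assumed local broker from step 1
TOPIC = "kafka-events"        # assumed topic name

samples = [
    '{"Id":"1234","user":"user@domain.com","type":"email","message":"Customer asked to be called by sales"}',
    '{"Id":"12345","user":"user@domain.com","type":"api","message":"Profile changed to developer"}',
    '{"Id":"123456","user":"user@domain.com","type":"incident","message":"Bug report on UI 504 error code"}',
    '{"Id":"1234567","user":"user@domain.com","type":"salesforce","message":"New eval created on Apigee"}',
]

producer = KafkaProducer(bootstrap_servers=BOOTSTRAP)
for sample in samples:
    producer.send(TOPIC, sample.encode("utf-8"))
producer.flush()
```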

Have fun, and let me know if you have any questions!

 

2 REPLIES

Hi, thank you for your post. It helps a lot. Do we need any subscription here? Please help.

Hello! You need access to Application Integration: https://cloud.google.com/application-integration?hl=en

There is a Pay As You Go offer.
