---
title: Kafka
description: Learn how to connect and configure Kafka connections in Infoveave to produce and consume messages from real-time streaming data pipelines.
---

import { Steps } from '@astrojs/starlight/components';

# Kafka

A Kafka connection lets Infoveave connect to Apache Kafka clusters and exchange streaming data for analytics, workflows, and real-time pipelines.

## Adding a Kafka Connection

To create a new Kafka connection:

<Steps>
1. Go to **Administration** > **Connections**
2. Click **New Connection** and select **Kafka** from the available connectors
3. In the **New Kafka Connection** form, provide the required configuration details
</Steps>

## Connection Parameters

When setting up a Kafka connection, you need to provide the following details:

| Field | Description |
|-------|-------------|
| **Name** | A unique name to identify your Kafka connection |
| **Bootstrap Servers** | The Kafka cluster bootstrap server list as comma-separated `host:port` pairs, for example: `broker1:9092,broker2:9092` |
| **Group ID** | The consumer group identifier used for message consumption |
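These fields correspond directly to standard Kafka client configuration. The sketch below shows that mapping using librdkafka-style keys (as used by clients such as confluent-kafka); it is an illustration of the parameters only, not Infoveave's internal API, and the `auto.offset.reset` default is an assumption.

```python
# Sketch: mapping the connection form fields onto standard Kafka client
# configuration keys (librdkafka convention). Not Infoveave's internal API.

def build_consumer_config(bootstrap_servers: str, group_id: str) -> dict:
    """Normalize the form fields into a Kafka consumer configuration."""
    # Kafka expects a comma-separated host:port list; strip stray spaces.
    servers = ",".join(s.strip() for s in bootstrap_servers.split(","))
    return {
        "bootstrap.servers": servers,
        "group.id": group_id,
        # Assumed default: start from the earliest message when the
        # consumer group has no committed offset.
        "auto.offset.reset": "earliest",
    }

config = build_consumer_config("broker1:9092, broker2:9092", "infoveave-analytics")
```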

## Authentication

Kafka clusters may require additional security configuration depending on your environment, such as SSL/TLS encryption or SASL authentication. Infoveave lets you set these parameters in the connection's advanced settings where applicable.
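For a SASL_SSL-protected cluster, the additional settings typically look like the fragment below, again using librdkafka-style keys. The credential values and CA path are placeholders, and the exact advanced-settings fields Infoveave exposes may differ.

```python
# Sketch: typical security overrides for a SASL_SSL cluster (librdkafka-style
# keys). Credential values and the CA path are placeholders, not real values.
secure_overrides = {
    "security.protocol": "SASL_SSL",   # TLS transport plus SASL authentication
    "sasl.mechanisms": "PLAIN",        # or SCRAM-SHA-256 / SCRAM-SHA-512
    "sasl.username": "<api-key>",      # placeholder credential
    "sasl.password": "<api-secret>",   # placeholder credential
    "ssl.ca.location": "/path/to/ca.pem",  # placeholder CA certificate bundle
}
```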

## Steps to Validate and Save

<Steps>
1. Enter your **Name**, **Bootstrap Servers**, and **Group ID**
2. Click **Validate** to verify the connection to your Kafka cluster
3. If validation succeeds, click **Save** to store the connection
4. You can now use this Kafka connection for producing or consuming messages in Infoveave workflows and streaming applications
</Steps>
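Conceptually, the **Validate** step checks that the bootstrap list is well-formed and that the brokers are reachable. The syntactic part can be sketched as follows; `is_valid_bootstrap_list` is a hypothetical helper for illustration, not Infoveave's actual validator.

```python
# Sketch: checking that a bootstrap server list is well-formed before
# attempting to reach the cluster. Hypothetical helper, for illustration.

def is_valid_bootstrap_list(bootstrap_servers: str) -> bool:
    """Return True if every entry looks like host:port with a numeric port."""
    entries = [s.strip() for s in bootstrap_servers.split(",") if s.strip()]
    if not entries:
        return False
    for entry in entries:
        # rpartition tolerates hostnames that themselves contain no colon.
        host, sep, port = entry.rpartition(":")
        if not sep or not host or not port.isdigit():
            return False
        if not 0 < int(port) <= 65535:
            return False
    return True
```

A full validation would additionally open a connection to at least one broker, which requires a live cluster and is omitted here.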

## Example Use Cases

With a Kafka connection configured, you can:

- Consume streaming data in real time for analytics and dashboards  
- Produce messages to Kafka topics from Infoveave workflows  
- Build end-to-end data pipelines using Kafka and other Datasources  
- Combine Kafka event streams with historical data for enriched insights  
- Trigger workflows automatically based on incoming Kafka messages  
- Monitor and process high-volume, low-latency event streams  
