Forwarding to Kafka Topic
Events are organized and durably stored in topics. In simple terms, a topic is like a folder in a filesystem, and the events are the files in that folder.
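If the destination topic does not exist yet, you can create it ahead of time with the standard Kafka CLI. The snippet below is a minimal sketch only; the broker address, topic name, partition count, and replication factor are illustrative assumptions, and it assumes Kafka 2.2 or later (where kafka-topics.sh accepts --bootstrap-server).

```
# Create the destination topic ahead of time (all names and values are hypothetical)
kafka-topics.sh --create \
  --bootstrap-server kafka-0.example.com:9092 \
  --topic sysdig-secure-events \
  --partitions 3 \
  --replication-factor 2
```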
Prerequisites
This integration is only for Sysdig On-Premises.
Configure Standard Integration
To forward secure data to Kafka:
1. Log in to Sysdig Secure On-Prem as Admin.
2. From the user menu on the bottom left, go to Settings > Event Forwarding.
3. Click +Add Integration and choose Kafka topic from the drop-down.
4. Configure the required options:
   - Integration Name: Define an integration name.
   - Brokers: Kafka server endpoints. A Kafka cluster may expose several brokers. Each entry follows the "hostname:port" format (without a protocol scheme), and you can list several brokers as a comma-separated list, as shown in the example after these steps.
   - Topic: Kafka topic where you want to store the forwarded data.
   - Partitioner/Balancer: Algorithm the client uses to multiplex data across multiple brokers. For compatibility with the Java client, Murmur2 is used as the default partitioner. Supported algorithms are:
     - Murmur2
     - Round robin
     - Least bytes
     - Hash
     - CRC32
   - Compression: Compression standard used for the data. Supported algorithms are:
     - LZ4
     - Snappy
     - Gzip
     - ZStandard
   - Authentication: The authentication method to adopt. Supported methods are:
     - None
     - Kerberos (GSSAPI). If you select this, you must provide the:
       - Principal
       - Realm
       - Service
       and the following files:
       - Keytab
       - krb5.conf
     - SASL/PLAIN. If you select this, you must provide:
       - Username
       - Password
     - SASL/SCRAM. If you select this, you must provide:
       - Algorithm, choosing between SHA-256 and SHA-512
       - Username
       - Password
   - Data to Send: Select from the drop-down the types of Sysdig data that should be forwarded. The available list depends on the Sysdig features and products you have enabled.
5. Select whether or not you want to allow insecure connections. Insecure connections have an invalid or self-signed certificate on the receiving side.
6. Select Test Integration. A verification sketch follows these steps.
7. Toggle the enable switch as necessary.
8. Click Save.
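For reference, a Brokers value is a comma-separated list of "hostname:port" entries, and you can watch the destination topic with the standard Kafka console consumer to confirm that forwarded events arrive. This is a minimal sketch, assuming hypothetical broker and topic names and a cluster reachable without SASL or TLS; for authenticated or TLS-protected clusters, pass the relevant client properties with --consumer.config.

```
# Example Brokers value: comma-separated "hostname:port" entries, no protocol scheme
#   kafka-0.example.com:9092,kafka-1.example.com:9092

# Watch the destination topic for forwarded events (hypothetical names)
kafka-console-consumer.sh \
  --bootstrap-server kafka-0.example.com:9092,kafka-1.example.com:9092 \
  --topic sysdig-secure-events \
  --from-beginning
```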
Configure Agent Local Forwarding
Review the configuration steps and use the following parameters for this integration.
| Integration Type | Attribute | Required? | Type | Allowed values | Default | Description |
|---|---|---|---|---|---|---|
| KAFKA | brokers | yes | string | | | |
| KAFKA | topic | yes | string | | | |
| KAFKA | compression | no | string | lz4, snappy, zstd, gzip | | |
| KAFKA | balancer | no | string | roundrobin, leastbytes, hash, crc32, murmur2 | murmur2 | |
| KAFKA | tls | no | bool | | false | |
| KAFKA | insecure | no | bool | | false | Doesn't verify the TLS certificate |
| KAFKA | auth | no | string | gssapi | | The authentication method to optionally use. Currently supports only GSSAPI |
| KAFKA | principal | no | string | | | GSSAPI principal. Required if GSSAPI authentication is selected |
| KAFKA | realm | no | string | | | GSSAPI realm. Required if GSSAPI authentication is selected |
| KAFKA | service | no | string | | | GSSAPI service name. Required if GSSAPI authentication is selected |
| KAFKA | keytab | no | string | | | Base64-encoded Kerberos keytab for GSSAPI. Required if GSSAPI authentication is selected |
| KAFKA | krb5 | no | string | | | Kerberos krb5.conf file content for GSSAPI. Required if GSSAPI authentication is selected |
| KAFKA | algorithm | no | string | sha-256, sha-512 | | SASL/SCRAM hashing algorithm |
| KAFKA | username | no | string | | | SASL username |
| KAFKA | password | no | string | | | SASL password |
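The keytab attribute expects the keytab file content encoded as a single-line base64 string. A minimal sketch for producing that value, assuming GNU coreutils base64 and a hypothetical keytab path (BSD/macOS base64 uses different flags):

```
# Encode a Kerberos keytab as single-line base64 (path is hypothetical)
base64 -w 0 /etc/krb5/sysdig-forwarder.keytab
```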