Forwarding to Elasticsearch
Prerequisites
Event forwards originate from region-specific IPs. For the full list of outbound IPs by region, see SaaS Regions and IP Ranges. Update your firewall to allow inbound requests from these IP addresses so that Sysdig can forward events.
You must have an instance of Elasticsearch running and permissions to access it.
Data streams are currently not supported. Configure your Elasticsearch index template with the data stream option turned off so that data is stored in regular indices.
Configure Standard Integration
Log in to Sysdig Secure as Admin and go to Profile > Settings > Event Forwarding.
Click +Add Integration and choose Elasticsearch from the drop-down menu.
Configure the required options:
Integration Name: Define an integration name.
Endpoint: Enter the specific Elasticsearch instance where the data will be saved. For Elastic Cloud and Elastic Cloud Enterprise, the endpoint is listed on the Deployments page.
Index Name: Name of the index under which the data will be stored. See also: https://www.elastic.co/blog/what-is-an-elasticsearch-index
Authentication: Basic authentication is the most common format (username:password). The given user must have write privileges in Elasticsearch; you can query the available users.
Data to Send: Select from the drop-down the types of Sysdig data that should be forwarded. The available list depends on the Sysdig features and products you have enabled.
Allow insecure connections: Skip certificate validation when connecting over HTTPS.
Toggle the enable switch as necessary. Note that you must validate the configuration with the Test Integration button below before enabling the integration.
Click Save.
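To confirm that the user you configured exists and has write access before saving, you can query the Elasticsearch security APIs directly. This is a sketch: the host, credentials, and index name below are placeholders, not values from this document.

```shell
# List the users known to the cluster (requires security privileges).
curl -u "username:password" "https://localhost:9200/_security/user"

# Check whether the authenticated user may write to the target index.
curl -u "username:password" -H "Content-Type: application/json" \
  "https://localhost:9200/_security/user/_has_privileges" \
  -d '{"index": [{"names": ["sysdig-events"], "privileges": ["write"]}]}'
```

The second call returns a JSON document whose has_all_requested field indicates whether the write privilege is granted.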
Timestamp Mapping
To handle timestamps directly in Elasticsearch, you might want to map them to the appropriate field type. Timestamps have nanosecond resolution in Sysdig and they are available both in epoch timestamp and in RFC 3339 format.
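As a minimal sketch of how the two representations relate, the following Python snippet converts an arbitrary sample nanosecond epoch value into its RFC 3339 form (the sample value and variable names are illustrative, not taken from Sysdig output):

```python
from datetime import datetime, timezone

# Sample epoch timestamp with nanosecond resolution (arbitrary value).
epoch_ns = 1714567890123456789

# Split into whole seconds and the nanosecond remainder,
# since datetime itself only carries microsecond precision.
secs, nanos = divmod(epoch_ns, 1_000_000_000)
dt = datetime.fromtimestamp(secs, tz=timezone.utc)

# RFC 3339 with the full nanosecond fraction appended.
rfc3339_nano = dt.strftime("%Y-%m-%dT%H:%M:%S") + f".{nanos:09d}Z"
print(rfc3339_nano)  # 2024-05-01T12:51:30.123456789Z
```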
The best approach is to use the date_nanos field type and define an explicit mapping in your Elasticsearch instance.
Perform a PUT /<index>/_mapping API call against the index into which you are storing data, using the following payload:
{
  "properties": {
    "timestampRFC3339Nano": {
      "type": "date_nanos",
      "format": "strict_date_optional_time_nanos"
    }
  }
}
If you use Kibana, you can apply the same mapping from its Dev Tools console instead.
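Assuming a cluster reachable at localhost:9200 and an index named sysdig-events (both placeholders), the mapping call could look like:

```shell
# Apply the nanosecond timestamp mapping to an existing index.
# Host, credentials, and index name are placeholders; substitute your own.
curl -X PUT "https://localhost:9200/sysdig-events/_mapping" \
  -u "username:password" \
  -H "Content-Type: application/json" \
  -d '{
    "properties": {
      "timestampRFC3339Nano": {
        "type": "date_nanos",
        "format": "strict_date_optional_time_nanos"
      }
    }
  }'
```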
Configure Agent Local Forwarding
Review the configuration steps and use the following parameters for this integration.
| Integration Type | Attribute | Required? | Type | Allowed values | Default | Description |
|---|---|---|---|---|---|---|
| ELASTIC | endpoint | yes | string | | | Elasticsearch instance endpoint URL |
| ELASTIC | indexName | yes | string | | | Name of the index to store the data in |
| ELASTIC | insecure | no | bool | | false | Doesn't verify the TLS certificate |
| ELASTIC | auth | no | string | BASIC_AUTH, BEARER_TOKEN | | Authentication method to use; secret is required if this is specified |
| ELASTIC | secret | no | string | | | Encoded basic authentication or bearer token value |
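For BASIC_AUTH, the secret attribute expects an encoded value. A minimal sketch, assuming the agent expects standard HTTP Basic encoding (base64 of username:password; the credentials below are placeholders):

```python
import base64

# Placeholder credentials; replace with the Elasticsearch user's.
username, password = "username", "password"

# HTTP Basic authentication encodes "username:password" in base64.
secret = base64.b64encode(f"{username}:{password}".encode()).decode()
print(secret)  # dXNlcm5hbWU6cGFzc3dvcmQ=
```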