
Splunk Search API

Description

The Splunk Search API source uses the Splunk Search API to collect past events.

Supported Platforms

| Platform | Metrics | Logs | Traces |
| --- | --- | --- | --- |
| Linux | | ✓ | |
| Windows | | ✓ | |
| macOS | | ✓ | |

Prerequisites

  • Splunk admin credentials

Use Case

Unlike other sources, the Splunk Search API (SSAPI) source is not built to stream live data. Instead, it collects a finite set of event data and transfers it to a destination, preserving the timestamp from the original Splunk event.
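Under the hood, collection is driven by Splunk's REST search-job endpoints (a job is created with `POST /services/search/jobs`, then polled and paged through). As a rough illustration only, not the source's actual implementation, building the job-creation request might look like the sketch below; the helper name and its parameters are hypothetical:

```python
def search_job_request(base_url: str, query: str, earliest: str, latest: str):
    """Build the URL and form body for creating a Splunk search job
    via POST /services/search/jobs."""
    if not query.strip().startswith("search"):
        # The source enforces the same rule: queries must begin with
        # the explicit "search" command.
        raise ValueError("query must begin with the explicit 'search' command")
    url = f"{base_url}/services/search/jobs"
    body = {
        "search": query,
        "earliest_time": earliest,  # time bounds are passed as request
        "latest_time": latest,      # parameters, not inside the query
        "output_mode": "json",
    }
    return url, body

url, body = search_job_request(
    "https://splunk.example.com:8089",
    "search index=my_index",
    "2024-12-01T05:00",
    "2025-01-01T04:59",
)
print(url)  # https://splunk.example.com:8089/services/search/jobs
```

This is also why the configured query must not embed its own time fields: the earliest and latest bounds are supplied separately alongside the search.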

Note: Once the source has started collecting events for a search, allow it to complete unless it is absolutely necessary to stop it. If a search must be interrupted, run the same search again to resume from where it left off. If you switch to a different search instead, the checkpoint for the original search will be wiped from storage.

Configuration Table

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| hostname | string | | Splunk search head hostname. |
| port | string | 8089 | Splunk instance endpoint port. |
| auth_mode | enum | "basic" | Authentication mode to use when connecting to the Splunk REST API. Valid values are "basic" and "token". |
| username | string | | Username used to authenticate to the Splunk REST API. |
| password | string | | Password used to authenticate to the Splunk REST API. |
| auth_token | string | | Auth token used to authenticate to the Splunk REST API. |
| token_type | enum | | Type of token used to authenticate to the Splunk REST API. Valid values are "Bearer" and "Splunk". |
| job_poll_interval | int | 5 | Number of seconds to wait between polls for search job completion. |
| searches.query | string | | Splunk search to run to retrieve the desired events. Queries must start with `search` and must not contain additional commands or any time fields (e.g. `earliest_time`). |
| searches.earliest_time | string | | Earliest timestamp to collect logs from (inclusive). |
| searches.latest_time | string | | Latest timestamp to collect logs from (inclusive). |
| searches.event_batch_size | int | 100 | Number of events to query from Splunk in a single request. |
| enable_tls | bool | true | Whether or not to use TLS. |
| tls_certificate_path | string | | Path to the TLS certificate to use for TLS-required connections. |
| tls_private_key_path | string | | Path to the TLS key to use for TLS-required connections. |
| enable_storage | bool | true | Whether or not to use a storage extension. Should be enabled in all environments. |
| storage_directory | string | $OIQ_OTEL_COLLECTOR_HOME/storage | The directory where the storage file will be created. |
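Putting the parameters above together, a configuration for a single-search migration might look like the following sketch. The receiver key and exact nesting are illustrative assumptions; verify them against your Bindplane/collector version:

```yaml
# Illustrative only: field names mirror the configuration table above,
# but confirm the receiver name and structure for your collector version.
splunksearchapi:
  hostname: splunk.example.com      # Splunk search head
  port: "8089"
  auth_mode: basic
  username: admin
  password: ${SPLUNK_PASSWORD}      # avoid hard-coding credentials
  job_poll_interval: 5
  searches:
    - query: search index=my_index  # must start with "search", no time fields
      earliest_time: "2024-12-01T05:00"
      latest_time: "2025-01-01T04:59"
      event_batch_size: 100
  enable_tls: true
  enable_storage: true              # required for checkpointing progress
  storage_directory: $OIQ_OTEL_COLLECTOR_HOME/storage
```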

Configuration Instructions

  1. Identify the Splunk index to migrate events from. Create a Splunk search to capture the events from that index. This will be the query you pass to the source.
  • Example: search index=my_index
  • Note: Queries must begin with the explicit search command and must not include additional commands or any time fields (e.g. earliest_time).
  2. Determine the timeframe you want to migrate events from, and set the "Earliest Time" and "Latest Time" config fields accordingly.
  • To migrate events from December 2024, EST (UTC-5):
    • Earliest Time: "2024-12-01T05:00"
    • Latest Time: "2025-01-01T04:59"
  • Note: By default, Google Cloud Logging (GCL) will not accept logs with a timestamp older than 30 days. Contact Google to modify this rule.
  3. Repeat steps 1 and 2 for each index you wish to collect from.
  • This is not a requirement. The source can run multiple searches at once, but migrating one index at a time allows for easier testing and debugging.
  4. Configure the rest of the source fields according to your Splunk environment.
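The time-window arithmetic in step 2 can be checked with a few lines of Python. This snippet (not part of the product) converts a local EST window for December 2024 into the UTC values shown above:

```python
from datetime import datetime, timedelta, timezone

# EST is a fixed UTC-5 offset (no DST handling needed for this example).
EST = timezone(timedelta(hours=-5))

earliest_local = datetime(2024, 12, 1, 0, 0, tzinfo=EST)    # start of Dec 2024
latest_local = datetime(2024, 12, 31, 23, 59, tzinfo=EST)   # end of Dec 2024

fmt = "%Y-%m-%dT%H:%M"
earliest = earliest_local.astimezone(timezone.utc).strftime(fmt)
latest = latest_local.astimezone(timezone.utc).strftime(fmt)

print(earliest)  # 2024-12-01T05:00
print(latest)    # 2025-01-01T04:59
```

Note how the end of the local month lands in the next calendar month in UTC, which is why the Latest Time example above is dated January 1st.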