Pushing Logs to Google Cloud Storage
Review this summary of common third-party tools that ChaosSearch users rely on to push logs to GCP Cloud Storage.
Supported Tools
The following sections describe some common, supported third-party tools, along with sample configuration files.
NOTE
This topic contains examples that could be helpful for your environment. Review them carefully; additional configuration or special-case handling might be required for your specific use case. Links to the relevant documentation precede each example.
Fluentd
Fluentd is an open-source data collector that helps to unify log collection and consumption. It attempts to unify log formats by structuring data as JSON wherever possible. More information is available at:
https://github.com/GoogleCloudPlatform/fluent-plugin-google-cloud
https://cloud.google.com/logging/docs/agent/logging/configuration#cloud-fluentd-config
A sample Fluentd configuration snippet for a GCP Cloud Storage and ChaosSearch implementation follows:
<match pattern>
  @type google_cloud  # send matched events through the Google Cloud output plugin
</match>
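A match block alone is not a complete pipeline; it needs an input source. The following sketch pairs a tail source with the Google Cloud output plugin. The log path, position file, and tag are hypothetical placeholders for illustration:
# Hypothetical end-to-end sketch; adjust paths and tags for your environment.
<source>
  @type tail                            # follow a log file as it grows
  path /var/log/myapp/app.log           # hypothetical application log
  pos_file /var/lib/fluentd/myapp.pos   # tracks the read position across restarts
  tag myapp.access
  <parse>
    @type json                          # parse each line as JSON
  </parse>
</source>

<match myapp.**>
  @type google_cloud                    # ship matched events to Google Cloud
</match>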
Cloudflare Logs via Logpush
Cloudflare Enterprise provides very detailed logs to customers for all types of troubleshooting and investigation. Typically, Cloudflare Enterprise users push their request or event logs to their cloud storage service using Logpush, which can be configured from the Cloudflare dashboard or API. More information is available at:
https://developers.cloudflare.com/logs/analytics-integrations/google-cloud
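Beyond the dashboard, Logpush jobs can be created through the Cloudflare API. The following Python sketch is a minimal illustration, assuming a hypothetical zone ID, API token, and GCS bucket; before a job can be created, Cloudflare also requires that its service account be granted write access to the bucket and that a destination ownership challenge be completed (see the linked documentation):
import requests

ZONE_ID = "your-zone-id"      # hypothetical; shown in the Cloudflare dashboard
API_TOKEN = "your-api-token"  # hypothetical; token needs Logs edit permission

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/logpush/jobs",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "name": "gcs-http-requests",                # hypothetical job name
        "dataset": "http_requests",                 # push HTTP request logs
        "destination_conf": "gs://my-bucket/http",  # hypothetical bucket and prefix
        "logpull_options": "fields=ClientIP,RayID,EdgeStartTimestamp&timestamps=rfc3339",
        "enabled": True,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())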
Fastly - Log Streaming
Fastly has a Log Streaming feature that allows you to automatically save logs to a third-party service for storage and analysis. More information is available at:
https://docs.fastly.com/en/guides/log-streaming-google-cloud-storage
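Log streaming to GCS can be set up from the Fastly web interface or programmatically against Fastly's logging API. The following Python sketch is a minimal illustration, assuming a hypothetical service ID, version number, and Google service-account credentials; it attaches a GCS logging endpoint to an editable service version, which must then be activated:
import requests

SERVICE_ID = "your-service-id"   # hypothetical Fastly service ID
VERSION = 1                      # hypothetical editable service version
FASTLY_TOKEN = "your-api-token"  # hypothetical Fastly API token

resp = requests.post(
    f"https://api.fastly.com/service/{SERVICE_ID}/version/{VERSION}/logging/gcs",
    headers={"Fastly-Key": FASTLY_TOKEN},
    data={
        "name": "gcs-logs",                                   # hypothetical endpoint name
        "bucket_name": "my-bucket",                           # hypothetical GCS bucket
        "user": "logger@my-project.iam.gserviceaccount.com",  # service account email
        "secret_key": "-----BEGIN PRIVATE KEY-----...",       # service account private key
        "period": 300,                                        # upload a new log file every 5 minutes
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())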
Vector
Vector is a tool that can collect, transform, and route your logs and metrics to your cloud storage. More information is available at:
https://vector.dev/docs/reference/configuration/sinks/gcp_cloud_storage/
https://vector.dev/guides/integrate/sinks/gcp_cloud_storage/
A sample Vector configuration file for pushing logs to GCP Cloud Storage follows:
[sinks.my_sink_id]
# REQUIRED - General
type = "gcp_cloud_storage"
inputs = ["my-source-id"]
bucket = "my-bucket"
# OPTIONAL - General
healthcheck = true # default
hostname = "127.0.0.1:5000"
# OPTIONAL - Batching
batch_size = 10490000 # default, bytes
batch_timeout = 300 # default, seconds
# OPTIONAL - Object Names
filename_append_uuid = true # default
filename_extension = "log" # default
filename_time_format = "%s" # default
key_prefix = "date=%F/"
# OPTIONAL - Requests
compression = "gzip" # no default, must be: "gzip" (if supplied)
encoding = "ndjson" # no default, enum: "ndjson" or "text"
rate_limit_duration = 1 # default, seconds
rate_limit_num = 5 # default
request_in_flight_limit = 5 # default
request_timeout_secs = 30 # default, seconds
retry_attempts = 5 # default
retry_backoff_secs = 5 # default, seconds
# OPTIONAL - Buffer
[sinks.my_sink_id.buffer]
type = "memory" # default, enum: "memory" or "disk"
when_full = "block" # default, enum: "block" or "drop_newest"
max_size = 104900000 # no default, bytes, relevant when type = "disk"
num_items = 500 # default, events, relevant when type = "memory"
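The sink above reads from a source named my-source-id, which is not shown in the sample. A minimal companion source definition, assuming a hypothetical log path, could look like the following and live in the same TOML file:
[sources.my-source-id]
type = "file"                        # tail files on disk
include = ["/var/log/myapp/*.log"]   # hypothetical glob of log files to follow
With both sections in place, start Vector against the file, for example with vector --config /etc/vector/vector.toml.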