Pushing Logs to Google Cloud Storage

This guide provides examples of tools that ChaosSearch users commonly use to push logs to Google Cloud Storage (GCS).

Supported Tools:

❗️

Please note that the following are examples only and should be reviewed and configured to meet your specific use case. Links to the relevant documentation are provided before each example.

Logstash

https://www.elastic.co/guide/en/logstash/current/plugins-outputs-google_cloud_storage.html

input {
  file {
    path => "/Users/chaossearch/*"
    start_position => "beginning"
  }
}

filter {
  # Strip the syslog-style prefix and capture the JSON payload that follows "]:"
  dissect {
    mapping => { "message" => "%{} %{} %{}  %{y} %{} %{} %{}]:%{restOfLine}" }
  }
  # Parse the captured JSON into top-level event fields
  json { source => "restOfLine" }
  # Drop the raw fields once they have been parsed
  mutate {
    remove_field => [ "message", "restOfLine" ]
  }
}

output {
  # Echo parsed events to the console for debugging
  stdout { codec => rubydebug }

  google_cloud_storage {
    bucket => "my_bucket"
    json_key_file => "/path/to/privatekey.json"
    temp_directory => "/tmp/logstash-gcs"
    log_file_prefix => "logstash_gcs"
    max_file_size_kbytes => 1024
    output_format => "json"
    date_pattern => "%Y-%m-%dT%H:00"
    flush_interval_secs => 2
    gzip => false
    gzip_content_encoding => false
    uploader_interval_secs => 60
    include_uuid => true
    include_hostname => true
  }
}
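
To try the pipeline, save it to a local file (the filename below is a placeholder) and point Logstash at it. The service account behind json_key_file must have write access to the target bucket.

bin/logstash -f gcs-pipeline.conf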

Fluentd

https://github.com/GoogleCloudPlatform/fluent-plugin-google-cloud
https://cloud.google.com/logging/docs/agent/logging/configuration#cloud-fluentd-config

# "pattern" is a Fluentd tag match; use ** to forward all events
<match pattern>
  @type google_cloud
</match>
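
The google_cloud plugin above ships events to Google Cloud Logging rather than writing objects into a bucket directly. For writing straight to GCS, the community fluent-plugin-gcs plugin is one option; the sketch below is illustrative, and the keys shown (project, keyfile, bucket, path, and the buffer settings) should be verified against that plugin's README for the version you install.

<match pattern>
  @type gcs

  # GCP project that owns the bucket
  project my-gcp-project
  # Service-account key with write access to the bucket
  keyfile /path/to/privatekey.json
  bucket my_bucket
  path logs/

  # Buffer events locally and flush to GCS in hourly chunks
  <buffer tag,time>
    @type file
    path /var/log/fluent/gcs
    timekey 1h
    timekey_wait 10m
  </buffer>
</match>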

Cloudflare Logs via Logpush

https://developers.cloudflare.com/logs/analytics-integrations/google-cloud

Fastly - Log Streaming

https://docs.fastly.com/en/guides/log-streaming-google-cloud-storage

Vector

https://vector.dev/docs/reference/configuration/sinks/gcp_cloud_storage/
https://vector.dev/guides/integrate/sinks/gcp_cloud_storage/

[sinks.my_sink_id]
  # REQUIRED - General
  type = "gcp_cloud_storage"
  inputs = ["my-source-id"]
  bucket = "my-bucket"

  # OPTIONAL - General
  healthcheck = true # default
  hostname = "127.0.0.0:5000"

  # OPTIONAL - Batching
  batch_size = 10490000 # default, bytes
  batch_timeout = 300 # default, seconds

  # OPTIONAL - Object Names
  filename_append_uuid = true # default
  filename_extension = "log" # default
  filename_time_format = "%s" # default
  key_prefix = "date=%F/"

  # OPTIONAL - Requests
  compression = "gzip" # no default, must be: "gzip" (if supplied)
  encoding = "ndjson" # no default, enum: "ndjson" or "text"
  gzip = false # default
  rate_limit_duration = 1 # default, seconds
  rate_limit_num = 5 # default
  request_in_flight_limit = 5 # default
  request_timeout_secs = 30 # default, seconds
  retry_attempts = 5 # default
  retry_backoff_secs = 5 # default, seconds

  # OPTIONAL - Buffer
  [sinks.my_sink_id.buffer]
    type = "memory" # default, enum: "memory" or "disk"
    when_full = "block" # default, enum: "block" or "drop_newest"
    max_size = 104900000 # no default, bytes, relevant when type = "disk"
    num_items = 500 # default, events, relevant when type = "memory"
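
To run the sink, save the configuration to a file (the filename below is a placeholder) and give Vector GCP credentials. Pointing the standard GOOGLE_APPLICATION_CREDENTIALS variable at a service-account key is one common approach, assuming the Vector version in use supports it; see the sink documentation linked above for the authentication options.

# Standard GCP auth variable; the key's service account needs write access to the bucket
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/privatekey.json

# Start Vector against the configuration above
vector --config vector.toml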
