/bulkexport/submit

Use the /bulkexport/submit endpoint to run a bulk export query.

A /bulkexport/submit request includes the Elasticsearch query that you want to run, along with information about the S3 storage destination, the file format and size, and the file compression option for the exported results.

Access to the /bulkexport/submit endpoint can be restricted to control which S3 destinations are permitted. A user submitting a bulk export query must have an appropriate chaos:query:export:submit permission for the target S3 destination, in addition to any existing permissions required for the query itself. The /bulkexport endpoints use the AWS Signature Version 4 (SigV4) signing algorithm for authentication.
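
For clients that do not use curl's built-in SigV4 support (shown in the request example below), a SigV4 library can handle the signing step. The following Python sketch builds a signed session with the requests and requests-aws4auth packages; treating the ChaosSearch API key and secret as the SigV4 access key and secret, and signing for the s3 service in your region, mirrors the curl example and is an assumption for illustration rather than an official client.

# Illustrative sketch only: sign ChaosSearch API requests with AWS SigV4.
# The credential mapping and the "s3" service name mirror the curl example
# later in this topic and are assumptions, not an official SDK.
import requests
from requests_aws4auth import AWS4Auth

auth = AWS4Auth("<API_Key>", "<Secret>", "<region>", "s3")

session = requests.Session()
session.auth = auth   # every request sent with this session is SigV4-signed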

Request

A sample submit request follows. The fields are described after the example.

POST /bulkexport/submit
{
  "params": { ... your Elasticsearch query ... },
  "exportOptions": {
    "queryName": "optional name",
    "compression": "NONE" | "GZIP",
    "destination": "{BUCKET NAME}/{S3 PATH PREFIX}",
    "unitFileSize": 100,
    "fileFormat": "JSON" | "CSV"
  },
  "type" : "elastic"
}

Description of Fields

  • params – Type or paste your Elasticsearch query in this field.
  • exportOptions – The options include the following fields:
    • queryName – An optional name for the export that appears in the Bulk Export UI menu.
    • compression – Specify the compression to use for the exported files: GZIP (default) or NONE.
    • destination – Specify the S3 bucket name and path prefix ({BUCKET NAME}/{S3 PATH PREFIX}) where you want to store the exported result set files.
    • unitFileSize – Specify the maximum size, in MB, for each exported file. The default is 100 MB.
    • fileFormat – Specify the format of the exported files as JSON (default) or CSV.
  • type – Set the type of bulk query export to elastic.

Request Example

A sample curl command to create a bulk export request follows. This example includes placeholders for the AWS region and for the ChaosSearch API key and secret used for SigV4 authentication, along with a sample query:

curl 'https://mycompany.chaossearch.com/bulkexport/submit' \
--aws-sigv4 "aws:amz:<region>:s3" --user "<API_Key>:<Secret>" \
-H 'Content-Type: application/json' \
--data-raw '{"params":{"index":"my-chaos-view","body":{"aggs": {"maxSizeField": 
    {"max":{ "field": "tripduration"} } },"size": 0, "query": { "range": {"timestamp": 
    {"format": "strict_date_optional_time", "gt": "2023-11-21 23:59:00.000Z", 
    "lt": "2023-11-23 00:00:00.000Z" }}}}}, "exportOptions":{"queryName":"my-export-name",
    "compression":"GZIP","destination":"my-s3-export-bkt/exports","unitFileSize":100,
    "fileFormat":"JSON"},"type":"elastic"}'

Response

The response for a successful submit request returns an identifier that can be used with the /bulkexport/status endpoint to poll for progress and completion status.

{  
  "id": "{TRACKING UID}"  
}

Response Example

{"id":"958cebcb-0aa5-45e2-bc0f-45f1980a8cb0"}%   

RBAC Controls for Submit

To submit a bulk export request, your account must have permission to use the bulk export feature itself, and must have permission to read and write to the designated S3 bucket in your customer account. The following example shows a permission statement for a destination bucket and prefix:

{  
  "Actions": [  
    "chaos:query:export:submit"  
  ],  
  "Effect": "Allow",  
  "Resources": [  
    "arn:aws:s3:::DOC-EXAMPLE-DESTINATION-BUCKET/some/prefix/*"  
  ],  
  "Version": "1.0"  
}